Eye Movement-Based Human-Computer Interaction Techniques: Toward Non-Command Interfaces
ABSTRACT
User-computer dialogues are typically one-sided, with the bandwidth from
computer to user far greater than that from user to computer. The movement
of a user’s eyes can provide a convenient, natural, and high-bandwidth source
of additional user input, to help redress this imbalance. We therefore investigate
the introduction of eye movements as a computer input medium. Our
emphasis is on the study of interaction techniques that incorporate eye movements
into the user-computer dialogue in a convenient and natural way. This
chapter describes research at NRL on developing such interaction techniques
and the broader issues raised by non-command-based interaction styles. It
discusses some of the human factors and technical considerations that arise in
trying to use eye movements as an input medium, describes our approach and
the first eye movement-based interaction techniques that we have devised and
implemented in our laboratory, reports our experiences and observations on
them, and considers eye movement-based interaction as an exemplar of a new,
more general class of non-command-based user-computer interaction.
INTRODUCTION
In searching for better interfaces between users and their computers, an additional mode
of communication between the two parties would be of great use. The problem of human-computer
interaction can be viewed as two powerful information processors (human and computer)
attempting to communicate with each other via a narrow-bandwidth, highly constrained
interface [25]. Faster, more natural, more convenient (and, particularly, more parallel, less
sequential) means for users and computers to exchange information are needed to increase the
useful bandwidth across that interface.
Outline
This chapter begins by discussing the non-command interaction style. Then it focuses on
eye movement-based interaction as an instance of this style. It introduces a taxonomy of the
interaction metaphors pertinent to eye movements. It describes research at NRL on developing
and studying eye movement-based interaction techniques. It discusses some of the human factors
and technical considerations that arise in trying to use eye movements as an input medium,
describes our approach and the first eye movement-based interaction techniques that we have
devised and implemented in our laboratory, and reports our experiences and observations on
them. Finally, the chapter returns to the theme of new interaction styles and attempts to identify
and separate out the characteristics of non-command styles and to consider the impact of
these styles on the future of user interface software.
NON-COMMAND INTERFACE STYLES
Eye movement-based interaction is one of several areas of current research in human-computer
interaction in which a new interface style seems to be emerging. It represents a
change in input from objects for the user to actuate by specific commands to passive equipment
that simply senses parameters of the user’s body. Jakob Nielsen describes this property:
The fifth generation user interface paradigm seems to be centered around non-command-based
dialogues. This term is a somewhat negative way of characterizing
a new form of interaction but so far, the unifying concept does seem to be exactly
the abandonment of the principle underlying all earlier paradigms: That a dialogue
has to be controlled by specific and precise commands issued by the user and processed
and replied to by the computer. The new interfaces are often not even dialogues
in the traditional meaning of the word, even though they obviously can be
analyzed as having some dialogue content at some level since they do involve the
exchange of information between a user and a computer. The principles shown at
CHI’90 which I am summarizing as being non-command-based interaction are eye
tracking interfaces, artificial realities, play-along music accompaniment, and agents
[19].
PREVIOUS WORK
While the current technology for measuring visual line of gaze is adequate, there has been
little research on using this information in real time. There is a considerable body of research
using eye tracking, but it has concentrated on using eye movement data as a tool for studying
motor and cognitive processes [14, 18]. Such work involves recording the eye movements and
subsequently analyzing them; the user’s eye movements do not have any effect on the computer
interface while it is in operation.
For use as a component of a user interface, the eye movement data must be obtained in
real time and used in some way that has an immediate effect on the dialogue. This situation
has been studied most often for disabled (quadriplegic) users, who can use only their eyes
for input (e.g., [11, 15, 16] report work for which the primary focus was disabled users). Because
all other user-computer communication modes are unavailable, the resulting interfaces are
rather slow and tricky to use for non-disabled people, but, of course, a tremendous boon to
their intended users.
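The real-time requirement described above can be made concrete with a minimal sketch (not from the chapter): gaze samples must be processed as they arrive and converted into events that affect the dialogue immediately, rather than logged for later analysis. The sketch below uses dwell time to turn a stream of gaze samples into interface events; the sample format, thresholds, and function names are illustrative assumptions, not part of the NRL system.

```python
# Illustrative sketch of real-time gaze input via dwell detection.
# Thresholds and the (timestamp_ms, x, y) sample format are assumptions.

DWELL_MS = 150        # assumed minimum dwell duration before an event fires
RADIUS_PX = 30        # assumed spatial tolerance for "same point of regard"

def detect_dwell(samples):
    """Yield (x, y, start_ms) whenever gaze stays within RADIUS_PX for DWELL_MS.

    `samples` is an iterable of (timestamp_ms, x, y) tuples, e.g. polled from
    an eye tracker inside the application's input loop.
    """
    anchor = None   # (t, x, y) at the start of the current candidate dwell
    fired = False   # has this dwell already produced an event?
    for t, x, y in samples:
        if anchor is None:
            anchor, fired = (t, x, y), False
            continue
        t0, x0, y0 = anchor
        if (x - x0) ** 2 + (y - y0) ** 2 > RADIUS_PX ** 2:
            anchor, fired = (t, x, y), False   # gaze moved: restart the dwell
        elif not fired and t - t0 >= DWELL_MS:
            fired = True                       # report each dwell exactly once
            yield (x0, y0, t0)

# Example: a saccade away from (500, 400), then a sustained dwell near (100, 100).
stream = [(0, 500, 400), (50, 100, 100), (100, 102, 101),
          (150, 101, 99), (200, 103, 100), (250, 100, 102)]
events = list(detect_dwell(stream))
print(events)   # one event for the dwell that began at t = 50 ms
```

The point of the generator structure is that events are available as soon as the dwell criterion is met, so the interface can respond while the user is still looking, which is exactly what distinguishes this use of eye movements from the record-then-analyze studies cited above.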