03-10-2012, 11:09 AM
Vision-Based Eye-Gaze Tracking for Human Computer Interface
Vision-Based Eye-Gaze.pdf (Size: 955.21 KB / Downloads: 49)
Abstract
Eye-gaze is an input mode which has the potential of an
efficient computer interface. Eye movement has been
the focus of research in this area. Non-intrusive eye-gaze
tracking that allows slight head movement is addressed
in this paper. A small 2D mark is employed as
a reference to compensate for this movement. The iris
center has been chosen for purposes of measuring eye
movement. The gaze point is estimated after acquiring
the eye movement data. Preliminary experimental results
are given through a screen pointing application.
Introduction
Human-computer interaction has become an increasingly
important part of our daily lives. The movement
of user’s eyes can provide a convenient, natural and
high-bandwidth source of input. By tracking the direction
of gaze of the user, the bandwidth of communication
from the user to the computer can be increased by
using the information about what the user is looking
at, and even designing objects specially intended for
the user to look at.
A variety of eye-gaze (eye-movement) tracking
techniques have been reported in the literature [1].
A short list includes Electro-Oculography [2], Limbus,
Pupil and Eyelid Tracking [3, 4, 5, 6, 7, 8, 9], Contact
Lens Method, Corneal and Pupil Reflection Relationship
[5, 4, 8], Purkinje Image Tracking, and Artificial Neural
Networks.
Tracking of Eye Movement
The location of face and eye should be known for tracking
eye movements. We assume this location information
has already been obtained through extant techniques.
Exact eye movements can be measured with
special-purpose techniques; this investigation concentrates
on tracking eye movement itself, with the primary goal
of detecting the exact eye position. Two algorithms
have been proposed for iris center detection:
the Longest Line Scanning and Occluded Circular Edge
Matching algorithms. The emphasis in this paper is on eye
movement, not on face and eye location.
A rough eye position is not sufficient for tracking eye-gaze
accurately; measuring the direction of visual attention
of the eyes requires more precise data from the eye
image.
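As a rough illustration, the Longest Line Scanning idea for iris center detection can be sketched as follows. This is a minimal interpretation, not the paper's exact algorithm: it assumes the iris is the darkest region of a cropped eye image, so the longest horizontal run of below-threshold pixels approximates the iris diameter, and that run's midpoint approximates the iris center. The `threshold` value and the synthetic test image are illustrative assumptions.

```python
import numpy as np

def longest_line_iris_center(eye_gray, threshold=60):
    """Estimate the iris center in a cropped grayscale eye image.

    Sketch of the 'Longest Line Scanning' idea (an interpretation):
    scan each row for runs of dark pixels; the longest run is taken
    as the iris diameter, and its midpoint as the iris center.
    """
    dark = eye_gray < threshold          # binary mask of dark pixels
    best_len, best_center = 0, None
    for y, row in enumerate(dark):
        x, n = 0, row.size
        while x < n:
            if row[x]:
                start = x
                while x < n and row[x]:
                    x += 1
                run = x - start
                if run > best_len:
                    best_len = run
                    best_center = ((start + x - 1) // 2, y)  # (col, row)
            else:
                x += 1
    return best_center

# Synthetic test: a dark disc ("iris") on a bright background.
img = np.full((40, 60), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:40, 0:60]
img[(xx - 30) ** 2 + (yy - 20) ** 2 <= 10 ** 2] = 20  # disc centered at (30, 20)
print(longest_line_iris_center(img))  # → (30, 20)
```

In a real system the eye region would first be cropped by a face/eye locator, as the paper assumes, and the threshold would be chosen adaptively.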
Estimation of Gazing Point
As previous work has reported, gaze estimation under free
head movement is very difficult. The focus here is
on estimating the orientation of the eyes under slight head
movement; the gaze direction must be estimated from the
image features and values measured at the eye movement
tracking stage. The direction of eye-gaze, including
the head orientation, is considered in this investigation. A
geometric model incorporating a reference has been devised.
The geometry consisting of subject’s face, camera,
and computer screen has been explored so as to understand
eye-gaze in this environment. Finally, a couple of estimation
methods have been proposed.
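The geometric scheme above can be sketched as follows, under stated assumptions: the iris center is measured relative to the tracked 2D reference mark (cancelling slight head translation), and a calibrated linear map converts that offset into a screen position. The affine least-squares fit, the calibration points, and the screen resolution are illustrative assumptions, not the paper's specific estimation methods.

```python
import numpy as np

def fit_gaze_map(offsets, screen_pts):
    """Least-squares affine map from (iris - mark) offsets to screen points.

    offsets: (N, 2) array of iris-center-minus-reference-mark pixel offsets
    screen_pts: (N, 2) array of known screen targets fixated during calibration
    Returns a 3x2 coefficient matrix for [dx, dy, 1] -> [sx, sy].
    """
    A = np.hstack([offsets, np.ones((len(offsets), 1))])
    coef, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coef

def gaze_point(coef, iris_xy, mark_xy):
    """Map a measured iris center, compensated by the mark, to the screen."""
    dx, dy = np.asarray(iris_xy, float) - np.asarray(mark_xy, float)
    return np.array([dx, dy, 1.0]) @ coef

# Calibration: the user fixates known screen corners while offsets are recorded
# (hypothetical numbers for a 1024x768 screen).
offsets = np.array([[-5, -3], [5, -3], [-5, 3], [5, 3]], float)
targets = np.array([[0, 0], [1024, 0], [0, 768], [1024, 768]], float)
coef = fit_gaze_map(offsets, targets)

# Zero offset between iris and mark maps to the screen center.
print(gaze_point(coef, iris_xy=(102, 57), mark_xy=(102, 57)))  # ≈ (512, 384)
```

Because the mark's image position is subtracted before the mapping, a small head translation that shifts iris and mark together leaves the estimated gaze point unchanged, which matches the compensation role the paper assigns to the 2D reference mark.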
Conclusion
Non-intrusive vision-based eye-gaze tracking methods involving
eye movement tracking (tracking the iris center rather than
merely locating the eye) and gaze estimation have been investigated
in this paper. Practical feasibility of the techniques
has been demonstrated by using them as one type of
computer interface (the substitute for a pointing device).
The subject is allowed to move slightly, in a natural way.
The eye-gaze is computed by finding correspondences between
points in a model of face and points in the camera
image.