09-09-2013, 03:42 PM
Detection of Movements of Head and Mouth to Provide Computer Access for Disabled
Detection of Movements.pdf (Size: 225.97 KB / Downloads: 21)
Abstract
This paper analyzes biometric identification and
tracking technologies for human-computer interaction.
Based on the AdaBoost face detection algorithm, we propose a
position-based head motion detection algorithm that does
not depend on specific biometric identification and
tracking. It uses a feature classification method to detect
the opening and closing of the mouth. We also design a software
system that operates a computer through image-based detection of
head and mouth movements. Combinations of head and mouth
movements are mapped to various mouse events, including
move, click, and drag. The system can be used by people with
upper limb disabilities who cannot use a traditional mouse
and keyboard. It can also be used by general computer
users for neck rehabilitation training, computer
somatic games, and similar applications.
INTRODUCTION
Common computer input devices, such as the mouse and
keyboard, are designed for able-bodied users. People with
upper limb disabilities find it difficult to manipulate a mouse
or keyboard, and therefore cannot use a computer the way
able-bodied users can.
To make computers accessible to people with
disabilities, scientists and engineers have carried out many
studies. Some earlier methods used auxiliary
equipment, such as infrared sensors and infrared reflectors,
to detect the movements of the computer user. Evans et al. used
infrared light-emitting diodes and photodetectors as
auxiliary equipment to determine the user's head position and
operate the computer [1]. Takami et al. invented a computer
interface device that places a transmitter above the
monitor and uses an infrared reflector attached to the
user's forehead or glasses [2]. Chen et al. developed a system
that contains an infrared transmitter mounted on the user's
eyeglasses, a set of infrared receiving modules that
substitute for the keys of a keyboard, and a tongue-touch panel
to activate the infrared beam [3]. Hutchinson et al. studied
eye gaze direction as a way to operate the computer, measured
via the corneal reflection [4].
ALGORITHMS FOR DETECTION OF HEAD AND MOUTH MOVEMENTS
The system analyzes the different combinations of the
detected head and mouth movements and maps them to mouse
events of the computer system.
A. The Basis of Head Movements
In our system, face detection is implemented with the
AdaBoost algorithm; each frame of the video stream
captured by a camera (30 frames per second) is the input
signal [7]. We then propose an algorithm that detects head
movements by analyzing face locations over time. We defined
five motions as the basis of head movements.
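The AdaBoost-based face detector described above is available off the shelf in OpenCV as a Haar cascade (the Viola-Jones detector). The following is a minimal sketch, not the paper's own code: the `face_center` helper reduces a detection rectangle to its geometric center, and the camera demo (which assumes OpenCV and the stock `haarcascade_frontalface_default.xml` model) is skipped if OpenCV is absent.

```python
def face_center(rect):
    """Geometric center (Sx, Sy) of a detected face rectangle (x, y, w, h),
    with the origin (0, 0) at the top-left corner, in pixels."""
    x, y, w, h = rect
    return (x + w // 2, y + h // 2)

try:
    import cv2  # OpenCV ships AdaBoost-trained Haar cascades (an assumption,
                # not necessarily the detector implementation used in the paper)
except ImportError:
    cv2 = None  # camera demo below is skipped without OpenCV

if cv2 is not None:
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # ~30 fps webcam, as in the paper
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            print("head center:", face_center((x, y, w, h)))
    cap.release()
```

In a live system, this detection step would run once per frame and feed the resulting centers into the head-movement analysis described in the next section.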
Algorithm for Detection of Head Movements
First, the system captures images with the camera and then
detects the head area in each image. Let the origin of the
coordinates (0, 0) be at the top-left corner of the image,
with the horizontal and vertical coordinates denoted x and y
respectively, as shown in Fig. 2(a). Coordinate values are
measured in pixels. The rectangle that frames the face is the
detected head area. We calculate the geometric center of this
rectangle and call it the head center coordinates, i.e. (Sx, Sy)
in Fig. 2(b). The specific head movement can then be determined
from the time series of these center coordinates. The algorithm
HEADMOVE for the detection of head movements is described
below.
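The pseudocode of HEADMOVE is not reproduced in this excerpt, so the following is a hedged sketch of one plausible implementation: classify a movement by comparing successive head center coordinates (Sx, Sy) against a displacement threshold. The threshold value, the function name, and the "still" label are illustrative assumptions, not taken from the paper.

```python
# Sketch of a HEADMOVE-style classifier. THRESH and the label names
# are assumptions for illustration, not values from the paper.
THRESH = 15  # minimum displacement in pixels to count as a movement

def head_move(prev_center, cur_center, thresh=THRESH):
    """Classify the head movement between two frames from the
    displacement of the head center (Sx, Sy). The origin (0, 0) is
    the image's top-left corner, so y grows downward; directions are
    in image coordinates, ignoring any camera mirroring."""
    dx = cur_center[0] - prev_center[0]
    dy = cur_center[1] - prev_center[1]
    if abs(dx) < thresh and abs(dy) < thresh:
        return "still"            # displacement too small to be a gesture
    if abs(dx) >= abs(dy):        # dominant axis decides the direction
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

A real system would apply this per frame (or over a short window, to smooth detection jitter) and pass the resulting labels to the event-mapping stage.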
Cursor of Head-Trace Mouse System
In this system the user does not operate a traditional mouse,
so we designed a special mouse cursor to replace the traditional
one and help the user operate the system easily. The
mouse cursor is a graphic consisting of five parts (up, down,
left, right, and center) plus a traditional cursor at the
upper-left corner, which we call the five-direction-graph cursor,
or five-direction-graph for short, shown in Fig. 6. In practice,
mouse events are mapped onto the mouse cursor shown on the
screen. Unless otherwise noted, we treat the words "mouse" and
"mouse cursor" as interchangeable for convenience and uniformity
of description (in practice they are easy to distinguish from
context). Fig. 6 shows some of the mouse events. The
remaining mouse events are each similar to the relevant
sub-graph, so we do not list all the five-direction-graphs
here. For example, moving the mouse right is similar to Fig.
6(b), moving it down is similar to Fig. 6(c), and double
click, right click, drag, etc. are similar to Fig. 6(e).
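The mapping from movement combinations to mouse events can be sketched as a simple dispatch table. The concrete pairings below (e.g. an open mouth plus a head movement producing a drag) are illustrative assumptions rather than the paper's exact mapping; an actual system would translate the returned event names into OS cursor actions through an automation layer such as pyautogui.

```python
# Illustrative dispatch table: (head movement, mouth state) -> mouse event.
# The specific pairings are assumptions; the paper defines its own
# combinations of head and mouth movements for these events.
EVENT_MAP = {
    ("left",  "closed"): "move_left",
    ("right", "closed"): "move_right",
    ("up",    "closed"): "move_up",
    ("down",  "closed"): "move_down",
    ("still", "open"):   "click",
    ("left",  "open"):   "drag_left",
    ("right", "open"):   "drag_right",
}

def mouse_event(head, mouth):
    """Translate a (head movement, mouth state) pair into a mouse event
    name; unknown combinations produce no event."""
    return EVENT_MAP.get((head, mouth), "none")
```

Keeping the mapping in a table rather than hard-coded branches also makes it easy to recalibrate the gesture-to-event assignments per user, which matters for an accessibility tool.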
CONCLUSIONS
Based on the AdaBoost face detection algorithm, this paper
discussed detection algorithms for head movements
(head up, down, left, and right) as well as mouth opening
and closing. On the basis of these algorithms,
we designed the Head-Trace Mouse software system for the
disabled, which can replace the traditional mouse by
detecting the user's head and mouth movements through a
camera. The system has been provided to a number of people
with upper limb disabilities and has received positive
evaluations from them. It has since been developed into a
commercial product and listed in the purchasing catalog of the
China Assistive Devices for Persons with Disabilities.