ELECTROOCULOGRAM BASED VIRTUAL KEYBOARD SEMINAR REPORT

ABSTRACT

The design and application of an efficient electrooculogram (EOG) based human-computer
interface (HCI) is presented here. Establishing an alternative communication channel that requires
neither speech nor hand movements is important for improving the quality of life of the handicapped.
EOG-based systems are more efficient than electroencephalogram (EEG) based systems in some cases.
With the realized virtual keyboard, a patient can express his or her needs in writing in a relatively
short time. Designed with the pitfalls of biopotential measurement in mind, the system classifies the
horizontal and vertical EOG channel signals through an efficient interface. The new system is
microcontroller based, with a common-mode rejection ratio of 88 dB, an electronic noise of
0.6 µV (p-p), and a sampling rate of 176 Hz. The nearest neighbor algorithm is used to classify the
signals, and the classification accuracy is 95%. The novel EOG-based HCI system allows people to
successfully and economically communicate with their environment by using only eye movements.

INTRODUCTION

Communication is essential for humans to lead daily life as members of society. However, individuals
with motor paralysis caused by conditions such as amyotrophic lateral sclerosis (ALS),
Guillain-Barré syndrome, or brain-stem infarction have difficulty conveying their intentions because
the motor neurons driving the voluntary muscles are affected. Various assistive technologies that
support communication have been developed for disabled people based on brain-computer interfaces;
some facilitate communication with others by supplementing impaired functions with surviving ones.
In terminal ALS patients, the eye-movement muscles are typically not affected: these patients can
move only their eyeballs. Establishing a new channel that requires neither overt speech nor hand/arm
motion makes life easier for such patients and therefore improves their quality of life. Paralyzed
stroke patients are likewise unable to communicate normally with their environment; for them, the
only part of the body under muscular control is often the eyeballs. Beyond the patients mentioned
above, writing messages on a computer screen and controlling a wheelchair or robot arm without
muscular movement are also useful for the elderly. The population of people aged 60 and over is
projected to grow substantially by 2030. Considering the extension of the human life span and the
needs of the handicapped, the demand for a human-computer interface (HCI) keeps increasing.

HUMAN COMPUTER INTERFACE

With the invention of the computer in the middle of the last century came the need for a user
interface. In the beginning, experts used teletypes to interact with the computer. Thanks to the
tremendous progress in computer technology over the last decades, the capabilities of computers
have increased enormously, and working with a computer has become a normal activity for nearly
everybody. With all the possibilities a computer can offer, humans and their interaction with
computers are now the limiting factor. This gave rise to extensive research in the field of HCI
(human-computer interaction), aiming to make interaction easier, more intuitive, and more efficient.
Interaction with computers is no longer limited to keyboards and printers: pointing devices,
touch-sensitive surfaces, high-resolution displays, microphones, and speakers are now ordinary
peripherals, and newer modalities include speech interaction and input by gestures or by tangible
objects with sensors. A further input modality is eye gaze, which nowadays finds application mainly
in accessibility systems. Such systems typically use eye gaze as the sole input, but outside the
field of accessibility, eye gaze can be combined with any other input modality and could therefore
serve as an interaction method beyond accessibility. The aim of this work is to find new forms of
interaction that utilize eye gaze and suit standard users.
In general, an interface that gives disabled people control of machines is called a man-machine
interface (MMI). If the control is performed through a computer-based (or microcomputer-based)
system, it is called an HCI instead, with the same meaning. The electrical signals generated by the
human brain in relation to body functions are recorded as an electroencephalogram (EEG). If an
assistive system is based on EEG, it is called a brain-computer interface (BCI), and its applications
for severely disabled people are increasing. An electrocorticogram (ECoG) can also supply BCI
control signals. Magnetoencephalography-based systems can be used for BCI as well, although they
are quite expensive compared with EEG-based systems. Here, the focus is on EEG-based BCI systems
as a point of comparison for EOG-based systems.

EEG-BASED HCI SYSTEMS

Electroencephalography (EEG) is the most studied non-invasive interface, mainly because of
its fine temporal resolution, ease of use, portability, and low set-up cost. But besides the technology's
susceptibility to noise, another substantial barrier to using EEG as a brain-computer interface is
the extensive training required before users can operate the technology. For example, in experiments
beginning in the mid-1990s, Niels Birbaumer at the University of Tübingen in Germany trained severely
paralysed people to self-regulate the slow cortical potentials in their EEG to such an extent that these
signals could be used as a binary signal to control a computer cursor. (Birbaumer had earlier trained
epileptics to prevent impending fits by controlling this low voltage wave.) The experiment saw ten
patients trained to move a computer cursor by controlling their brainwaves. The process was slow,
requiring more than an hour for patients to write 100 characters with the cursor, while training often
took many months.
EEG-based systems are the most commonly used in HCI applications because of the possibility
of noninvasive measurement on the scalp. BCI systems are generally EEG-based systems and can
translate brain activity into electrical signals that control external devices. BCI systems can provide a
communication and control channel that bypasses conventional neuromuscular pathways involved in
speaking or making movements to manipulate objects. The control signal can be used for a spelling
device and in controlling the cursor on the computer monitor. The aim of BCI systems is to enable
completely paralyzed patients (locked-in syndrome) to communicate with their environment. BCI
systems are anticipated to play an important role in the development of assistive and therapeutic
technologies for paralyzed patients, prosthesis or orthosis control, and movement rehabilitation after
stroke or spinal cord injury. For severely paralyzed patients, they can be a technology for increasing
or maintaining communication and control options and, in turn, improving quality of life.

EOG-BASED HCI SYSTEMS

The EOG signal arises from the standing electrical potential difference between the cornea and the
retina, which is revealed when the eyes move. The amplitude of the signal ranges from about 50 to
3500 µV, and its frequency components extend from 0 to about 100 Hz. The biomedical signals used
in the present study are EOG signals: horizontal and vertical eye movements and voluntary eye blinks
all generate electrical activity. The eye behaves as an electrical dipole, with a positive potential
at the cornea and a negative potential at the retina, so a nearly constant potential difference (the
corneal-retinal potential) exists between them. Body-surface electrodes placed around the eye socket
can detect the potential changes produced by eye movements. These signals show characteristic
patterns for each kind of eye movement (left, right, up, down, and blink). Once the patterns are
recognized, the acquired signals can be used to control external devices such as virtual keyboards,
powered wheelchairs, movable arms, and robots. An EOG-based virtual keyboard lets paralyzed
patients type letters onto a monitor with eye movements, without using a normal keyboard. Most
research in this field has focused on translating the four eye movements (left, right, up, and down)
plus the eye blink into character selections on the monitor (i.e., a speller).
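As a rough illustration of how the per-movement signal patterns can drive a speller, the sketch below assigns one of the five movement labels to a pair of channel deflections by simple amplitude thresholding. The threshold value and the sign conventions are assumptions made for illustration; the report's own classifier is a nearest-neighbor scheme, not this rule.

```python
# Hypothetical sketch: classify eye movements from the two EOG channels
# by amplitude thresholding. Channel values are in microvolts; the
# threshold and the sign conventions below are illustrative assumptions,
# not values taken from the report.

THRESH_UV = 100.0  # assumed minimum deflection to count as a movement

def classify_movement(h_uv, v_uv):
    """Map a (horizontal, vertical) EOG deflection to an eye movement.

    h_uv > 0 is assumed to mean a rightward gaze shift and
    v_uv > 0 an upward shift.
    """
    if abs(h_uv) < THRESH_UV and abs(v_uv) < THRESH_UV:
        return "center"            # no significant deflection
    if abs(h_uv) >= abs(v_uv):     # horizontal channel dominates
        return "right" if h_uv > 0 else "left"
    return "up" if v_uv > 0 else "down"
```

A speller built on such labels would then step a cursor across an on-screen character grid, one movement per step.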

EOG MEASUREMENT

In all biosignal recordings, including EOG measurement, the electrodes are the initial elements:
they convert the biopotentials produced by biopotential sources into electrical signals. Fig. 2
shows the simplified biopotential measurement. EOG voltage changes can be measured with surface
electrodes. Silver/silver-chloride (Ag/AgCl) electrodes are preferred for biosignal measurements:
because AgCl is a slightly soluble salt, it quickly saturates and comes to equilibrium, which makes
Ag/AgCl well suited to metallic skin-surface electrodes. Because EOG deflections accompany the
muscular movements of the eyes, EMG electrodes are convenient for their measurement; EEG electrodes
can also be used. Contact impedance should be less than 10 kΩ over the frequency range of 30 to 200
Hz. In a typical EOG acquisition system, a reference/ground electrode is placed on the forehead;
electrodes on the right and left temples detect horizontal (lateral) eye movement, and electrodes
above and below one eye detect vertical eye movement. Horizontal and vertical movements are measured
and plotted as separate channel voltages, or the EOG can be calibrated for each subject to yield
eye-movement angles in both the horizontal and the vertical plane. EOG signals roughly occupy the
band from 0 to 100 Hz with amplitudes of 50 to 3500 µV. Measuring and processing the EOG is easier
than for the EEG (< 100 µV) because EOG signals have much greater amplitude. Horizontal and vertical
eye movements and eye blinks generate easily distinguishable EOG signals even in the raw time series,
without preprocessing, so they need neither averaging nor the sophisticated signal-processing and
classification methods that EEG signals require.
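The conditioning such a channel needs can be sketched in a few lines: remove the dc offset and lightly smooth the trace. Only the 176 Hz sampling rate comes from the report; the mean-subtraction approach and the window length here are illustrative assumptions, not the report's actual front end (which sums opposite phases in hardware).

```python
# Minimal sketch (not the report's circuit) of conditioning a sampled
# EOG channel: subtract the dc level, then apply a short causal moving
# average to suppress high-frequency noise.

FS_HZ = 176  # sampling rate stated in the report

def remove_dc(samples):
    """Subtract the mean so the trace is centered on zero."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

def moving_average(samples, window=5):
    """Causal moving average; window=5 at 176 Hz spans about 28 ms."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def condition(samples):
    """DC removal followed by smoothing."""
    return moving_average(remove_dc(samples))
```

Because the EOG band only extends to about 100 Hz, even this crude smoothing leaves the movement-related deflections intact while attenuating electronic noise.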

CONCLUSION

Crucial factors in the design of an EOG-based system include subject/patient safety, power-line
noise reduction, and preserving the original signal, requirements shared by all other biopotential
measurement systems. In this system, the summing-opposite-phase approach was used to remove the dc
level and reduce power-line noise, with quite sufficient results. If a biopotential signal is
acquired at a larger amplitude, it is less affected by electronic noise; because EOG signals are
considerably larger than EEG signals, the EOG-based HCI is more efficient than the EEG-based HCI,
and the EOG signals can be acquired and conditioned practically. The new system is microcontroller
based, with a common-mode rejection ratio of 88 dB, an electronic noise of 0.6 µV (p-p), and
a sampling rate of 176 Hz. The nearest neighbor algorithm is used to classify the signals, and the
classification accuracy is 95%. The novel EOG-based HCI system allows people to successfully
and economically communicate with their environment by using only eye movements.
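A minimal sketch of the nearest-neighbor classification step mentioned above: an unknown feature vector takes the label of its closest training example under Euclidean distance. The two-feature representation (horizontal and vertical deflection) and the prototype values below are illustrative assumptions, not the report's data.

```python
# Hedged sketch of nearest-neighbor classification: assign the label of
# the closest training example. The feature layout and prototypes are
# illustrative assumptions.
import math

def nearest_neighbor(x, training):
    """Return the label of the training example closest to x."""
    best_label, best_dist = None, math.inf
    for features, label in training:
        d = math.dist(x, features)  # Euclidean distance
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Illustrative prototypes: (horizontal_uv, vertical_uv) -> movement
TRAIN = [
    ((300.0, 0.0), "right"),
    ((-300.0, 0.0), "left"),
    ((0.0, 300.0), "up"),
    ((0.0, -300.0), "down"),
    ((0.0, 0.0), "center"),
]
```

With well-separated EOG deflection patterns, even this simple classifier can plausibly reach the high accuracy the report cites, which is part of the argument for EOG over EEG.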