29-11-2012, 06:20 PM
Adaptive Brain Interfaces (ABI)
ABSTRACT
Adaptive Brain Interfaces (ABI) is a project within the European Union's ESPRIT information technologies programme, with the central aim of extending the capabilities of physically impaired people to access new services and opportunities. The ABI is a portable brain-computer interface based on the analysis of electroencephalogram (EEG) signals, together with a P300-based speller interface.
A cap with a few integrated electrodes acquires brain signals, which are pre-processed and sent to a computer for further analysis. The portable brain interface has an embedded neural network classifier that recognizes which mental task the wearer is concentrating on. It does so by analyzing continuous variations of EEG signals over several cortical areas of the brain. Each mental task is associated with a simple command. This enables people to communicate using their brain activity alone: the interface only requires users to be conscious of their thoughts and to concentrate sufficiently on the mental expression of the commands needed to carry out the desired task. By composing command sequences (thoughts), the user can read a web page, interact with games, turn on appliances, or even guide a wheelchair.
A brain interface is most successful when it is adapted to its owner. The approach is based on a mutual learning process in which the user and the ABI interface are coupled together and adapt to each other. The neural network has been specifically designed to cope with the challenging problem of recognizing mental tasks from spontaneous, on-line EEG signals. Although the immediate application of ABI is to help physically disabled or impaired people by increasing their independence and facilitating access to the Information Society, the benefits of such a system are broader: it can also serve health and safety purposes, such as monitoring a person's level of alertness, and could contribute to the medical diagnosis of brain disorders.
INTRODUCTION
Adaptive Brain Interface (ABI) is a human-computer interface system that accepts voluntary commands directly from the brain to interact with the surrounding environment or to perform a particular task. It is sometimes called a direct neural interface, brain-computer interface (BCI), or brain-machine interface: a direct communication pathway between a brain and an external device. BCIs are aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. The approach on which the ABI is based is, as the name implies, adaptiveness: both the system and the user adapt to each other. In ABI the adaptive part is the local neural classifier, which is responsible for classifying the input signal, while the user adapts by training on the mental tasks he or she finds most comfortable and effective to use. A second important principle is that the system should also work reliably outside the laboratory, i.e. in normal everyday life, which calls for an easy-to-use, wearable (small and light) system. Compared with other BCIs, one of the ABI's strengths is the short training time: a user can acquire good control over the system in just five days.
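The mutual-adaptation idea can be sketched with a minimal online update rule. This is an illustrative assumption, not the ABI's actual algorithm: suppose each mental task is represented by a prototype feature vector that drifts toward the user's recent EEG features, so the classifier tracks the user while the user practices the tasks. The function name and learning rate below are hypothetical.

```python
import numpy as np

def update_prototype(prototype, features, lr=0.05):
    """Move a task's prototype vector a small step toward the latest
    feature vector observed for that task (simple online mean update).
    lr is an assumed learning rate controlling adaptation speed."""
    return prototype + lr * (features - prototype)
```

Repeated calls with a small lr make the prototype converge slowly toward the running average of the user's features, which is one simple way a classifier and a user can co-adapt over training sessions.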
RESEARCH HISTORY WITH FACTS
Monkeys in North Carolina have remotely operated a robotic arm 600 miles away in MIT's Touch Lab using their brain signals. The feat is based on a neural-recording system in which tiny electrodes implanted in the animals' brains detected their brain signals as they controlled a robot arm to reach for a piece of food.
According to the scientists from Duke University Medical Center, MIT and the State University of New York (SUNY) Health Science Center, the new system could form the basis for a brain-machine interface that would allow paralyzed patients to control the movement of prosthetic limbs. The Internet experiment "was a historic moment, the start of something totally new," Mandayam Srinivasan, director of MIT's Touch Lab, said in a November 15 story in the Wall Street Journal. The work also supports new thinking about how the brain encodes information, by spreading it across large populations of neurons and by rapidly adapting to new circumstances.
WORKING OF ADAPTIVE BRAIN INTERFACE
Electrodes placed on the scalp or within the head acquire signals from the brain, and the BCI system processes them to extract specific signal features that reflect the user’s intent. The BCI translates these features into commands that operate a device—for example, a word-processing program, speech synthesizer, robotic arm, or wheelchair.
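As a rough illustration of this pipeline (acquire signal, extract features, translate features into a command), the sketch below computes classic EEG frequency-band powers as features and maps them to a command with a nearest-prototype rule. Everything here is an assumption for illustration: the function names, the command set, and the sampling rate are invented, and the real ABI uses a neural network classifier rather than this distance rule.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate signal power in the [low, high] Hz band via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def extract_features(epoch, fs=128):
    """Per-channel band powers in two classic EEG bands (alpha 8-12 Hz, beta 13-30 Hz).
    epoch has shape (channels, samples)."""
    bands = [(8, 12), (13, 30)]
    return np.array([band_power(ch, fs, lo, hi) for ch in epoch for lo, hi in bands])

# Hypothetical command set for a wheelchair-style device.
COMMANDS = ["left", "right", "forward"]

def classify(features, prototypes):
    """Nearest-prototype classifier: pick the command whose prototype
    feature vector is closest to the observed features."""
    dists = [np.linalg.norm(features - p) for p in prototypes]
    return COMMANDS[int(np.argmin(dists))]
```

For example, a one-second epoch dominated by 10 Hz activity yields a large alpha-band feature and a small beta-band feature, and the classifier then emits whichever command's prototype best matches that pattern.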