26-06-2012, 12:30 PM
Seminar Report on Brain–Machine Interfaces (BMIs)
INTRODUCTION
For the past 40 years researchers have been trying to transform thought into action. Recent advances in neuroscience and neurotechnology have initiated a renewed interest in the development of brain–machine interfaces (BMIs), also called brain–computer interfaces (BCIs). The human brain is the centre of the human nervous system. It gives us the power to see, hear, smell, taste, touch, feel, think, plan, speak, and imagine. The brain is made of approximately 100 billion nerve cells called neurons. When we talk about high-end computing and intelligent interfaces, we cannot ignore robotics and artificial intelligence. Researchers are close to breakthroughs in neural interfaces, meaning we could soon mesh our minds with machines. This technology has the capability to impact our lives in ways previously thought possible only in sci-fi movies. Advances in cognitive neuroscience and brain-imaging technologies give us an unprecedented ability to interface directly with brain activity, letting us monitor the physical processes in the brain that correspond to certain forms of thought. Driven by society's growing recognition of the needs of people with physical disabilities, researchers have begun using these technologies to build brain–computer interface (BCI) communication systems that do not depend on the brain's normal output pathways of peripheral nerves and muscles.
GENERAL PRINCIPLE BEHIND BCI
The main principle behind this interface is the bioelectrical activity of nerves and muscles. It is now well established that the human body, composed of living tissues, can be considered a power station generating multiple electrical signals from two internal sources: muscles and nerves.
We know that the brain is the most important part of the human body: it controls all the emotions and functions of the body. The brain is composed of billions of neurons. These neurons work together in complex logic and produce the thoughts and signals that control our bodies. When a neuron fires, or activates, there is a voltage change across the cell (~100 mV) which can be read by a variety of devices. When we want to make a voluntary action, the command is generated in the frontal lobe, and signals are produced on the surface of the brain. These electrical signals differ in magnitude and frequency.
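Because the signals differ in magnitude and frequency, a common first step in processing them is to measure how much power a recording carries in each frequency band. The following is a minimal illustrative sketch (not from any particular BCI system) that estimates band power with a plain FFT on a synthetic signal standing in for a one-channel EEG trace; all names and numbers are this example's own assumptions.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate the power of `signal` (sampled at `fs` Hz) in [low, high] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum()

# Synthetic "EEG": a strong 10 Hz alpha rhythm plus weak noise, 1 s at 256 Hz.
rng = np.random.default_rng(0)
fs = 256
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 13)   # alpha band (8–13 Hz)
beta = band_power(eeg, fs, 14, 30)   # beta band (14–30 Hz)
print(alpha > beta)  # the 10 Hz rhythm dominates, so this prints True
```

A real system would use many channels, sliding windows, and more robust spectral estimators, but the idea is the same: the relative power in such bands becomes the feature vector a BCI classifies.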
THE BRAIN MACHINE INTERFACE
A brain–machine interface (BMI) is an attempt to mesh our minds with machines. It is a communication channel from a human's brain to a computer that does not rely on the usual output pathways such as muscles. It is about asking the brain to accommodate synthetic devices, and learning how to control those devices much the way we control our arms and legs today. These experiments lend hope that people with spinal injuries will someday be able to use their brains to control a prosthetic limb, or even their own arm. A BMI could, for example, allow a paralyzed patient to convey her or his intentions to a computer program. Applications in which healthy users benefit from direct brain–computer communication are also conceivable, e.g., to speed up reaction times. Initially these interactions are with peripheral devices, but ultimately they may be with another brain. The first peripheral devices were robotic arms. Our robotic approach is based on an artificial neural network that recognizes and classifies the different brain-activation patterns associated with carefully selected mental tasks. Using a BMI, artificial electrical signals can also stimulate brain tissue in order to transmit particular sensory information.
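The classification step described above can be sketched very simply. The example below is hypothetical: it uses a nearest-centroid classifier as a stand-in for the artificial neural network, and made-up two-dimensional feature vectors (think band powers) representing two mental tasks.

```python
import numpy as np

# Hypothetical training data: each row is a feature vector recorded while the
# user performed one of two mental tasks.
task_a = np.array([[0.9, 0.2], [0.8, 0.3], [1.0, 0.1]])  # e.g. "imagine left hand"
task_b = np.array([[0.2, 0.9], [0.3, 0.8], [0.1, 1.0]])  # e.g. "mental arithmetic"

# One prototype (centroid) pattern per task.
centroids = {"task_a": task_a.mean(axis=0), "task_b": task_b.mean(axis=0)}

def classify(features):
    """Assign a new brain-activity pattern to the nearest task prototype."""
    return min(centroids, key=lambda k: np.linalg.norm(features - centroids[k]))

print(classify(np.array([0.85, 0.25])))  # -> task_a
```

A trained neural network replaces the distance rule with a learned nonlinear decision boundary, but the interface contract is identical: feature vector in, mental-task label out, and that label drives the device.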
EARLY WORK
Studies that developed algorithms to reconstruct movements from motor cortex neurons, which control movement, date back to the 1970s. Work by groups in the 1970s established that monkeys could quickly learn to voluntarily control the firing rate of individual neurons in the primary motor cortex via closed-loop operant conditioning. There has been rapid development in BCIs since the mid-1990s. Several groups have been able to capture complex brain motor-centre signals using recordings from neural ensembles (groups of neurons) and use these to control external devices. The first intra-cortical brain–computer interface was built by implanting neurotrophic cone electrodes into monkeys. In 1999, researchers decoded neuronal firings to reproduce images seen by cats: the team used an array of electrodes embedded in the thalamus of sharp-eyed cats, targeting 177 brain cells in the lateral geniculate nucleus, the area that decodes signals from the retina. Neural ensembles are said to reduce the variability in output produced by single electrodes, which could otherwise make a brain–computer interface difficult to operate. After conducting initial studies in rats during the 1990s, researchers developed brain–computer interfaces that decoded brain activity in owl monkeys and used the devices to reproduce monkey movements in robotic arms. Researchers also reported training rhesus monkeys to use a brain–computer interface to track visual targets on a computer screen, with or without the assistance of a joystick (a closed-loop brain–computer interface).
‘BRAINGATE’ BRAIN COMPUTER INTERFACE
An implantable brain–computer interface has been clinically tested on humans by the American company Cyberkinetics. The ‘BrainGate’ device can provide paralyzed or motor-impaired patients a mode of communication through the translation of thought into direct computer control. The technology driving this breakthrough in the brain–machine interface field has a myriad of potential applications, including the development of human augmentation for military and commercial purposes. The sensor consists of a tiny chip with one hundred electrode sensors, each of which detects the electrical activity of brain cells. The chip is implanted on the surface of the brain in the motor cortex area that controls movement. Computers translate the brain activity and create the communication output using custom decoding software.
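BrainGate's actual decoding software is proprietary, but a common baseline for this kind of system is a linear decoder fit during a calibration session: firing rates from the electrode array in, intended cursor velocity out. The sketch below uses synthetic data and least squares purely to illustrate that idea; all dimensions and values are this example's assumptions, not BrainGate specifics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: firing rates from 100 electrodes over 200
# time steps, paired with the 2-D cursor velocity the user intended.
n_electrodes, n_samples = 100, 200
true_weights = rng.normal(size=(n_electrodes, 2))        # unknown in practice
rates = rng.normal(size=(n_samples, n_electrodes))       # neural features
velocity = rates @ true_weights + 0.01 * rng.normal(size=(n_samples, 2))

# Fit a linear decoder by least squares on the calibration pairs.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode an intended 2-D velocity from a new pattern of firing rates.
new_rates = rng.normal(size=n_electrodes)
decoded = new_rates @ weights
```

Production decoders add filtering over time (e.g. Kalman-style state estimation) and periodic recalibration as the recorded neural population drifts, but the core mapping from ensemble activity to a control signal is the same shape as this sketch.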