AT THE INTERFACE OF
BRAIN AND MACHINE
BY DAN BACHER
One day in late 2010, something remarkable happened that
changed my life. I had been leading a project to develop a communication
system for people with locked-in syndrome as part
of my group’s work with the BrainGate Neural Interface System
(NIS). For this specific project, our objective was to create an
interface that would allow users to communicate using only
their thoughts. I was responsible for developing the virtual
keyboard software and integrating it with the NIS.
On that day in 2010, the plan was to test my keyboard
interface with clinical trial participant S3. At the time, her
usual method of communicating was to slowly move her
eyes to individual letters printed on a clear piece of plastic,
while a person behind the plastic would record each letter
she chose. But on this day, she would use only her thoughts
to move and click a computer cursor to type with my onscreen
keyboard.
S3’s eyes lit up when she saw the keyboard. I was trying to demonstrate some of the features when instead she defiantly
started typing on her own: first “thank,” then “you.” Those two
simple words—so commonly and automatically exchanged—
were the most powerful words that had ever been spoken to me.
(I do mean spoken: S3 used the built-in text-to-speech feature
I’d integrated to have the computer speak her message.) This
transformative moment was the first of what would become a
series of exciting, humbling, and emotional experiences with S3
and other participants in the BrainGate clinical trial.
In the following months, I worked with a team of engineers
to create software that could translate the BrainGate system’s
command signals into coordinated movements of an advanced
robotic arm. Months of long hours of developing, refining, and
validating our software were put to the test in April 2011. I
was by S3’s side once again when she used this robotic arm
to give herself a drink of coffee. Controlling the robotic arm
only with her imagined movements, she reached out, picked
up a bottle, took a drink, and put the bottle back down onto
the table—a feat she last performed with her own arm nearly
15 years earlier.
DECODING NEURAL SIGNALS
BY BEATA JAROSIEWICZ, PhD
I am a neuroscientist by training, but during the course of my
research career, I have learned computer programming skills
that have become crucial to my work on BrainGate. My focus
has been on using my neuroscience knowledge to help improve
the computer programs that decode neural signals associated
with the intent to move a limb.
The starting point of the BrainGate neural interface system
is an electrode array placed in the hand/arm area of the motor
cortex. These electrodes record action potentials, or “spikes,”
from neurons. When the person opens or closes (or imagines
opening or closing) her hand, we find some neurons that
increase or decrease their spiking rate. Other neurons change
their spike rate for different intended directions of movement.
For example, one neuron might increase its spiking rate for a
rightward arm movement and decrease its rate for a leftward
movement. That neuron would be said to have a “preferred
direction” to the right. Other nearby neurons might have preferred
directions to the left, up, down, forward, backward, or
anywhere in between.
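The “preferred direction” idea described above is often formalized as cosine tuning: a neuron fires fastest for movements along its preferred direction and slowest for movements in the opposite direction. A minimal sketch in Python follows; the baseline and modulation values are made up for illustration and are not BrainGate data.

```python
import numpy as np

def cosine_tuning(theta, baseline=20.0, modulation=15.0, preferred=0.0):
    """Expected firing rate (spikes/s) for an intended movement at angle
    theta (radians), for a neuron whose preferred direction is `preferred`.
    Parameter values here are illustrative, not measured."""
    return baseline + modulation * np.cos(theta - preferred)

# A neuron tuned to rightward movement (preferred direction = 0 rad):
rate_right = cosine_tuning(0.0)    # peak rate, movement along preferred dir
rate_left = cosine_tuning(np.pi)   # minimum rate, opposite direction
```

Under this model, a movement at right angles to the preferred direction leaves the neuron near its baseline rate, which is why a population of neurons with varied preferred directions is needed to pin down the intended direction.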
We begin each research session with our study participants by figuring
out how each recorded neuron’s firing rate modulates with intended movements.
We do this by displaying a cursor programmed to move to targets that
appear one by one on a computer monitor while the participant imagines
using her hand to move the cursor. During this calibration, a computer
registers the spike rate of each neuron. Then, using the spiking information
and the imagined movement information, the computer creates a model of
each recorded neuron’s preferred direction.
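The calibration step described above can be sketched as a least-squares fit: under cosine tuning, a neuron’s rate is linear in the cosine and sine of the intended direction, so ordinary least squares recovers the preferred direction as the angle of the fitted coefficients. This is an illustrative sketch on synthetic data, not the actual BrainGate calibration software.

```python
import numpy as np

# Synthetic calibration data: each trial pairs an intended movement
# angle with an observed spike rate. The tuning parameters (baseline 20,
# modulation 15, preferred direction 30 degrees) are invented for the demo.
rng = np.random.default_rng(0)
true_pd = np.deg2rad(30.0)
angles = rng.uniform(0, 2 * np.pi, size=200)     # intended directions
rates = 20 + 15 * np.cos(angles - true_pd) + rng.normal(0, 1, size=200)

# Design matrix [1, cos(theta), sin(theta)]; rate = b0 + bx*cos + by*sin
X = np.column_stack([np.ones_like(angles), np.cos(angles), np.sin(angles)])
b0, bx, by = np.linalg.lstsq(X, rates, rcond=None)[0]

preferred_direction = np.arctan2(by, bx)   # radians; should be near 30 deg
modulation_depth = np.hypot(bx, by)        # spikes/s above baseline at peak
```

Repeating this fit once per recorded neuron yields the per-neuron model the text describes.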
UNTETHERING THE LOCKED-IN MIND BY DAVID BORTON, PhD
As a neuroengineer, I try to solve neuroscience problems with the use of
modern technology, such as custom electrical circuits, chips, and software.
People often say that technological advances have made the unthinkable possible,
but in the case of neuroengineering, they’ve made the thinkable possible.
The human brain consists of more than 80 billion neurons making over
100 trillion connections. These neurons communicate with each other by
sending electrical pulses, called action potentials, along their long axons
and to neighboring neurons. How do we listen to, and make sense of, so
many signals? Neuroengineers have already met one part of the challenge
by designing specialized “microphones” that can sense the millions of action
potentials every second as the neurons communicate with one another.
Currently, using a microelectrode array of 100 recording elements, we
can listen to the activity of roughly 100 individual neurons at once. In the
BrainGate project, these signals are transmitted outside the body through
a long cable, amplified to distinguish them from background noise in the
brain, digitized into binary code, and processed with computational algorithms
to decode what the signals might mean.
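As one concrete example of such a computational algorithm, the classic population-vector approach lets each neuron “vote” along its preferred direction, weighted by how far its current rate sits above its baseline. The sketch below is illustrative only; decoders used in practice in this field are more sophisticated (Kalman filters, for example).

```python
import numpy as np

def population_vector(rates, baselines, preferred_dirs):
    """Decode a 2-D movement direction from one time bin of spike rates.

    rates, baselines: arrays of shape (n_neurons,), in spikes/s
    preferred_dirs:   array of shape (n_neurons,), in radians
    """
    weights = rates - baselines                 # each neuron's modulation
    vx = np.sum(weights * np.cos(preferred_dirs))
    vy = np.sum(weights * np.sin(preferred_dirs))
    return np.arctan2(vy, vx)                   # decoded angle, radians

# Toy example: three neurons tuned right, up, and left. The current
# rates (invented numbers) most strongly excite the "up" neuron.
rates = np.array([22.0, 30.0, 18.0])
baselines = np.array([20.0, 20.0, 20.0])
pds = np.array([0.0, np.pi / 2, np.pi])
decoded = population_vector(rates, baselines, pds)
```

Run on each successive time bin, the decoded angle becomes a stream of direction commands for a cursor or robotic arm.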
While this method of neural recording works incredibly well, when we
look to a future when locked-in patients are moving their own limbs to
walk down the hallway, we realize that the transmission of all this neural
data must be done wirelessly. To achieve this, we must reinvent the amplifier,
digitizer, and data transmission mechanisms so they can be implanted
in the patient.
The amplifiers currently used by the BrainGate team are the size of a
hardback book, and the digital signal processors take up the majority of a
personal computer’s memory. To create smaller electronics, we leveraged
advances in microelectronics ranging from chip design to flexible printed
circuit board technology. We have designed custom ultra-low-power
application-specific integrated circuits (or ASICs), amplifiers the size of
an M&M, and integrated digitization circuitry—and put all of this into
a device the size of a U.S. quarter. Through an encoded high-frequency
radio transmission scheme similar to 4G LTE, this device transmits the
digital neural data from the patient to a computer across the room, where
it can be processed into prosthetic control signals. The device is packaged
PUTTING RESEARCH TO THE TEST BY ERIN GALLIVAN
One afternoon at work, S3 and I were having a conversation
much like one you would have with any friend or coworker.
She was telling me a story about her grandson, recalling
his reaction to a present he received on his birthday. Our
conversation, however, looked anything but typical: I was
holding up a clear letter board as she focused her eyes on
individual letters. When I met her gaze through the board,
I said the letter out loud; if I was correct, we would move on
to the next one, spelling out the words and sentences that
made up the conversation.
With the BrainGate system, I have since seen S3 use typing
interfaces to quickly spell out phrases on a computer. Our goal
is that people will someday be able to use the system 24/7,
without any assistance. For now, because our study is still in
Phase 1, to evaluate the device’s safety, participants can use the
BrainGate system only when a trained technician, like me, is
present during our twice-weekly research sessions.
My job as a clinical research assistant is to run these sessions.
On a typical day, I travel to the participant’s home and ask him or
her to consent to a research session. After downloading the session
software sent by the research scientists, I set up the neural
connection by attaching a cable to the connector implanted on
the participant’s head. I then explain the experiment, describing
the task and the type of mental imagery to use to complete it,
which varies from moving a computer cursor to operating
a robotic arm. Sessions run for about four hours, and I
interact with the participant the whole time, answering their questions and giving instructions, making a detailed log of the
session, and recording video. At the end of the session, I send
the data to the researchers at Brown for analysis.
It is part of my job to make sure that the technology we are
developing is easy and enjoyable to use, and the participants
offer a lot of great feedback and suggestions, which I pass on
to the rest of the team. For example, when using the keyboard
interfaces, participants offered suggestions for making the
layout easier to navigate and adding shortcuts to help them
type faster. My primary goal is to collect and deliver the data,
but my personal interaction with our participants allows me to
see how the research will directly impact their lives.
Working with BrainGate has opened my eyes to all the
good that can be done through science, medicine, and engineering.
I will be leaving this position to go to medical school
next month, and although I am sad to be moving on, the work
that I have done here has made me realize that this is the correct
career path for me.