Computers that read your mind
DO YOU use the internet while watching television, listen to music while working at your computer, or
read e-mail while talking on the phone? According to Linda Stone, a former Microsoft and Apple executive,
this is the era of “continuous partial attention”. People flit constantly between technologies, yet never
devote their undivided attention to any of them, she observes. The e-mails, instant messages, text
messages, calendar alerts, telephone calls and the occasional, old-fashioned face-to-face conversation are
all competing for their share of your awareness. Part of the problem is that today's technologies lack the
intelligence to determine when to interrupt people—and, more importantly, when to leave them be.
Now a new class of technologies is being designed to help users to regain their focus and enjoy more
lucidity and concentration. The new field is known as “augmented cognition”, and it employs sensors to
infer the mental state of someone using a device. Rather than trying to read the user's mind directly—the
approach taken in a different field, known as brain-computer interfaces (BCI)—augmented cognition has a
subtly but crucially different aim. BCI devices are used to control things in the physical world, such as a
cursor on a screen, a wheelchair or even a prosthetic limb. Augmented cognition, in contrast, focuses on
deducing a cognitive state with the aim of somehow enhancing it.
So when someone is overwhelmed with information, an augmented cognition system would try to help him
cope by diverting some of it. Naturally enough, augmented cognition has captured the imagination of the
armed forces—the Pentagon's Defence Advanced Research Projects Agency (DARPA) is one of its biggest
backers. That is because today's military personnel are bombarded not just by the enemy, but also by
information, says Dylan Schmorrow, founder and former programme manager of DARPA's AugCog
programme. (Dr Schmorrow, who now works at the Office of Naval Research in Arlington, Virginia, will also
chair an international conference on augmented cognition which takes place next month in San
Francisco.)
Augmented cognition should be able to help soldiers and fighter pilots make sense of high volumes of data
and exploit them, rather than being swamped by them, says Dr Schmorrow. One idea, dubbed the
CogPit, is a smart cockpit for fighter aircraft. The system monitors the pilot's use of
conventional controls but also takes readings from electroencephalogram (EEG) sensors built into a
helmet to measure brain activity. If the pilot is targeting the enemy and the aircraft detects a threat,
such as a surface-to-air missile, the CogPit system decides whether to alert him with a verbal or
onscreen warning, at the risk of distracting him.
From changes in frequency of the pilot's brain waves and contextual information—such as how he is
interacting with the fighter's controls—the system can recognise when the pilot's task is too delicate to be
interrupted. If necessary, the system can even deploy countermeasures automatically, depending upon
how serious the threat is, says Blair Dickson of QinetiQ, Britain's privatised military-research agency, which
has been working on part of the CogPit project.
As well as determining how to treat new information, the system can also try to reduce stress on the pilot
by filtering out non-essential information, says Dr Schmorrow. If the EEG and pilot's behaviour indicate
that he is becoming overwhelmed, it can temporarily “grey out” less vital onscreen information and reduce
incoming radio communications. This should free the pilot to concentrate on the information that matters
most.
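The decision logic described above can be illustrated with a small sketch. Everything here is hypothetical: the function names, thresholds and the workload index are stand-ins for illustration, not QinetiQ's actual design. The sketch assumes workload is estimated from EEG band power (a commonly used engagement index is beta divided by alpha plus theta) and that an imminent threat outranks any workload concern.

```python
# Hypothetical sketch of CogPit-style interruption management.
# All names and thresholds are illustrative, not the real system's.

def workload_index(theta: float, alpha: float, beta: float) -> float:
    """A common EEG engagement index: beta / (alpha + theta).
    Higher values suggest a more heavily loaded operator."""
    return beta / (alpha + theta)

def handle_threat(severity: int, theta: float, alpha: float, beta: float) -> str:
    """Decide how to present a new threat, given estimated pilot workload.

    severity: 1 (low) .. 3 (imminent, e.g. an inbound missile).
    Returns the action the cockpit would take.
    """
    load = workload_index(theta, alpha, beta)
    if severity >= 3:
        # An imminent threat always gets through; if the pilot is badly
        # overloaded, the system acts for him instead of warning him.
        return "deploy_countermeasures" if load > 1.5 else "verbal_alert"
    if load > 1.0:
        # Pilot looks overloaded: grey out non-essential displays and
        # queue the warning rather than interrupting.
        return "grey_out_and_queue"
    return "onscreen_alert"
```

The key property, mirroring the article, is that the same threat can produce different behaviour depending on the pilot's inferred state: a low-priority warning is queued when workload is high, while a severe threat can trigger automatic countermeasures.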
“A lot of people still think it is science fiction,” says Dr Schmorrow, “but there have been some profound
advances in the last six months.” In an assessment of the system, six F-16 pilots were recently asked to
carry out a mission in a CogPit simulator in which they faced a series of threats, including heat-seeking
surface-to-air missiles. Although the sample was too small to be statistically significant, the missions that
used the CogPit's adaptive-autopilot feature suffered less damage than those that did not, says Dr
Dickson.
Besides recording EEGs, the DARPA programme has also measured such things as heart rate, sweat,
pupil dilation and even posture. “The more involved you get into a task, the more you lean into the desk,”
says David Kobus of Pacific Science and Engineering in San Diego, California, who helped to set up DARPA's
AugCog programme. Such information can help reveal the mental state of an air-traffic controller, for
example.
Body talk
Even at noisy and chaotic times, such as during a military exercise, augmented cognition has worked
surprisingly well, says Dr Schmorrow. In trials where teams of four soldiers had to rescue a hostage,
head-up displays relayed information visually and vibrotactile vests gave soldiers navigational information
about where to go during an ambush. Given how unreliable some sensors can be, particularly EEG
sensors, the results were encouraging, says Dr Schmorrow. And in addition to helping individual soldiers,
the system can also benefit those in the command centre, by indicating when a particular unit has its
hands full or is too tired to move elsewhere.
The armed forces are not alone in wanting to get more out of people. Augmented cognition could also find
its way into the workplace. Microsoft has been devising tools to improve productivity by working out what
people are doing. This is just what the company's infamous paperclip was supposed to do, popping up on
the screen and offering advice whenever its Bayesian engine statistically determined that the user needed
assistance. Yet the clip soon became an irritation, distracting the user and reducing productivity. (Part of
the problem was that Microsoft was so proud of the feature that it made the clip more prominent.)
Augmented cognition promises to be a different story, insists Eric Horvitz, a senior researcher at Microsoft
Research in Redmond, Washington, and president of the American Association for Artificial Intelligence.
The emphasis now is on filtering information before it reaches the user, he says. By controlling the flow of
information, it should be possible to increase the amount of information people can absorb without
overloading them, says Dr Horvitz.
The trick in getting this right lies in the ability to recognise cognitive limitations and biases. And unlike
brain-computer interfaces and many of the military applications, this does not call for brain sensors, he says. Plenty of
measures can tell you something about the user's state of mind: keystrokes, how many windows are open
and their content, whether the user is scrolling, the time of day, the contents of a desktop calendar—even
background noises from a microphone and visual information from a camera.
Copyright © 2006 The Economist Newspaper and The Economist Group. All rights reserved.
By analysing someone's behaviour during a training period, a program called BusyBody can learn to tell
what someone is doing and whether to interrupt him. The system can distinguish between types of
behaviour and incoming information, says Dr Horvitz. In effect, it gauges how busy the user is and the
urgency of the new data, whether in the form of e-mails, telephone calls or text messages. “We weigh the
cost of interruption against the benefits of seeing time-critical information,” says Dr Horvitz.
During a video-conference call, for example, the system might decide not to notify you that a piece of
spam had arrived in your mailbox. Similarly, it should know that you do not want to be disturbed by a
message from a friend just before the deadline for a report, but that it should put through a call from a
colleague who is working on the same project.
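Dr Horvitz's trade-off can be sketched in a few lines. This is an illustrative toy, not BusyBody's real model: the state names and the cost and value figures are invented, standing in for what the trained system would estimate from keystrokes, open windows and the like.

```python
# Illustrative sketch of BusyBody's core idea: interrupt only when the
# value of delivering a message now outweighs the cost of breaking the
# user's focus. All numbers and labels here are made up.

# Hypothetical cost of interrupting the user in each inferred state.
INTERRUPTION_COST = {
    "video_conference": 5.0,
    "writing_report": 3.0,
    "reading_email": 1.0,
    "idle": 0.2,
}

# Hypothetical value of delivering each kind of message immediately.
MESSAGE_VALUE = {
    "spam": 0.0,
    "friend_chat": 0.5,
    "project_colleague_call": 4.0,
}

def should_interrupt(user_state: str, message_kind: str) -> bool:
    """Deliver now only if the message is worth more than the disruption."""
    return MESSAGE_VALUE[message_kind] > INTERRUPTION_COST[user_state]
```

With these toy numbers, spam never beats the cost of a video conference, while a call from a project colleague is worth interrupting report-writing for, matching the examples in the text.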
Dr Horvitz's team has also looked at how “models of interruption”, as he calls them, can filter out
information when people are driving. One program, called ShortStop, which runs on a smartphone, can
predict when a driver is likely to stop and how long that stop will last. The model is based on logs of more
than 18,000 miles (29,000km) covered in 2,500 trips made by Microsoft volunteers.
When a call comes in, the program looks at the position, speed, time of day and even the weather to
calculate the chances that the driver will stop soon. If a stop is likely, the call is answered and put on hold
until the car stops. A message informs the caller that the driver is busy, but may answer soon. If the
driver is likely to motor on, the call is diverted to voicemail.
The system is surprisingly accurate, says Dr Horvitz. It can predict whether a person will stop for just five
seconds, 35 seconds, or more than a minute, with an accuracy of about 80%. And rather than having to
be trained by each user, the system appears to generalise well. Given the amount of information being
thrown at motorists, such a system could help take some of the stress out of driving.
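The routing behaviour described above can be sketched as follows. The probability model here is a deliberately crude stand-in; the real system learns its estimates from thousands of logged trips, and every feature and threshold below is an assumption made for illustration.

```python
# Hypothetical sketch of ShortStop-style call routing. The probability
# estimate is a toy stand-in for a model trained on trip logs.

def stop_probability(speed_kmh: float, distance_to_junction_m: float,
                     rush_hour: bool) -> float:
    """Toy estimate of the chance the driver stops within ~30 seconds."""
    p = 0.1
    if speed_kmh < 20:
        p += 0.4          # already slowing: a stop is likely
    if distance_to_junction_m < 100:
        p += 0.3          # approaching a junction
    if rush_hour:
        p += 0.1          # congestion makes stops more frequent
    return min(p, 1.0)

def route_call(speed_kmh: float, distance_to_junction_m: float,
               rush_hour: bool, threshold: float = 0.5) -> str:
    """Hold the call if a stop looks imminent, else divert to voicemail."""
    if stop_probability(speed_kmh, distance_to_junction_m, rush_hour) >= threshold:
        return "hold_until_stopped"   # caller is told the driver may answer soon
    return "voicemail"
```

A car crawling towards a junction would have the call held; one cruising on an open road would send the caller to voicemail, as the article describes.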
Mind games
When playing video games, a little stress can be a good thing. There is a lot of interest in using
augmented cognition in gaming, says Dr Schmorrow, because by sensing a gamer's cognitive state you
can make the game more fun. In particular, the technique can stop players getting bored or lost, says John Laird,
the director of the Artificial Intelligence Laboratory at the University of Michigan in Ann Arbor. In a
haunted-house role-playing game, Dr Laird has used augmented cognition to infer mental states and
determine whether the player needs help to get back on track and advance the game's story.
Although Dr Laird is basing this purely on the player's activity and behaviour within the game world,
others are using physiological sensors, too. Alan Dix, an expert in human and computer interaction at
Lancaster University in England, predicts that within a few years game consoles will come with a range of
sensors designed to measure a player's state of alertness. EmSense, a company based in Monterey,
California, is already pursuing this approach. It sells a lightweight headset that monitors a player's
brainwaves, heart rate and breathing. It can also tell whether the wearer is moving.
Working with his colleague Kiel Gilleade, Dr Dix linked a heart monitor to a first-person shooter—a game in
which the player views the world along the barrel of a gun. “The idea was to see if you can change the
gameplay to keep people at an optimal level of arousal,” says Dr Dix. As the player's heart rate drops, the
game becomes harder; but if he becomes too excited, it will start to ease off. “We don't want to kill them,”
says Dr Dix. The result is a more compelling game. On the battlefield, in the office, in the car and even at
home, there are good reasons to teach computers to read their users' minds.
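The heart-rate feedback loop Dr Dix describes can be sketched in a few lines. The target band and step size are illustrative assumptions, not values from his experiment: difficulty rises when the player's heart rate falls below the band and eases off when it climbs above it.

```python
# Minimal sketch of an arousal-adaptive difficulty loop, in the spirit of
# Dr Dix's heart-rate-linked shooter. Thresholds and step are made up.

def adjust_difficulty(difficulty: float, heart_rate_bpm: float,
                      low: float = 80.0, high: float = 110.0) -> float:
    """Nudge difficulty to keep the player inside a target arousal band."""
    if heart_rate_bpm < low:
        difficulty += 0.1      # player calming down: make the game harder
    elif heart_rate_bpm > high:
        difficulty -= 0.1      # player over-excited: ease off
    return max(0.0, min(difficulty, 1.0))   # clamp to [0, 1]
```

Called once per sensor reading, this keeps nudging the game towards the player's optimal level of arousal, which is exactly the "we don't want to kill them" behaviour described above.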