Blue Eyes
ABSTRACT:
Animal survival depends on highly developed sensory abilities. Likewise, human cognition depends on highly developed abilities to perceive, integrate, and interpret visual, auditory, and touch information. Without a doubt, computers would be much more powerful if they had even a small fraction of the perceptual ability of animals or humans. Adding such perceptual abilities to computers would enable computers and humans to work together more as partners. Toward this end, the BlueEyes project aims at creating computational devices with the sort of perceptual abilities that people take for granted.
Imagine yourself in a world where humans interact naturally with computers. You are sitting in front of your personal computer that can listen, talk, or even scream aloud. It has the ability to gather information about you and interact with you through special techniques like facial recognition, speech recognition, etc. It can even understand your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you. You ask the computer to dial your friend at his office. It realizes the urgency of the situation through the mouse, dials your friend at his office, and establishes a connection.
Human cognition depends primarily on the ability to perceive, interpret, and integrate audio, visual, and sensory information. Adding extraordinary perceptual abilities to computers would enable them to work together with human beings as intimate partners. Researchers are attempting to add capabilities to computers that will allow them to interact like humans: recognize human presence, talk, listen, and even guess a person's feelings.
The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses non-obtrusive sensing methods, employing the most modern video cameras and microphones, to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.
The basic idea behind this technology is to give the computer human-like perceptual power. We all have some perceptual abilities; that is, we can understand each other's feelings. For example, we can understand a person's emotional state by analyzing his facial expression. Adding these human perceptual abilities to computers would enable them to work together with human beings as intimate partners.
Theory
Based on Paul Ekman's facial expression work, we see a correlation between a person's emotional state and a person's physiological measurements. Selected works from Ekman and others on measuring facial behaviors describe Ekman's Facial Action Coding System (Ekman and Rosenberg, 1997). One of his experiments involved participants attached to devices to record certain measurements including pulse, galvanic skin response (GSR), temperature, somatic movement and blood pressure. He then recorded the measurements as the participants were instructed to mimic facial expressions which corresponded to the six basic emotions. He defined the six basic emotions as anger, fear, sadness, disgust, joy and surprise. From this work, Dryer (1993) determined how physiological measures could be used to distinguish various emotional states.
Six participants were trained to exhibit the facial expressions of the six basic emotions. While each participant exhibited these expressions, the physiological changes associated with affect were assessed. The measures taken were GSR, heart rate, skin temperature and general somatic activity (GSA). These data were then subjected to two analyses. For the first analysis, a multidimensional scaling (MDS) procedure was used to determine the dimensionality of the data. This analysis suggested that the physiological similarities and dissimilarities of the six emotional states fit within a four-dimensional model. For the second analysis, a discriminant function analysis was used to determine the mathematical functions that would distinguish the six emotional states. This analysis suggested that all four physiological variables made significant, nonredundant contributions to the functions that distinguish the six states. Moreover, these analyses indicate that these four physiological measures are sufficient to reliably determine a person's specific emotional state. Because of our need to incorporate these measurements into a small, non-intrusive form, we will explore taking these measurements from the hand. The conductivity of the skin is best measured from the fingers. However, the other measures may not be as obvious or robust. We hypothesize that changes in the temperature of the finger are reliable predictors of emotion. We also hypothesize that GSA can be measured by changes in the movement of the computer mouse.
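As a rough illustration of the discriminant-analysis idea described above, the following Python sketch trains a linear discriminant classifier on hypothetical readings of the four physiological measures (GSR, heart rate, skin temperature, GSA) and predicts one of the six basic emotions. The sample values and the use of scikit-learn are illustrative assumptions, not Dryer's actual data or procedure.

# Minimal sketch: discriminant analysis over the four physiological measures.
# All numbers below are made up purely for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Each row is one hypothetical observation: [GSR, heart_rate, skin_temp, GSA]
X_train = np.array([
    [2.1, 92, 33.5, 0.40], [2.2, 90, 33.4, 0.42],   # anger
    [2.4, 98, 32.8, 0.55], [2.5, 99, 32.9, 0.53],   # fear
    [1.2, 70, 33.0, 0.10], [1.1, 72, 33.1, 0.12],   # sadness
    [1.8, 75, 33.2, 0.20], [1.7, 76, 33.3, 0.22],   # disgust
    [1.5, 80, 34.1, 0.30], [1.6, 81, 34.0, 0.28],   # joy
    [2.0, 88, 33.7, 0.50], [1.9, 87, 33.6, 0.48],   # surprise
])
y_train = ["anger", "anger", "fear", "fear", "sadness", "sadness",
           "disgust", "disgust", "joy", "joy", "surprise", "surprise"]

model = LinearDiscriminantAnalysis()
model.fit(X_train, y_train)

# Classify a new set of readings into one of the six emotional states.
new_reading = np.array([[2.45, 98, 32.85, 0.54]])
print(model.predict(new_reading))  # -> ['fear']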
IBM Blue Eyes project gets to know users better
By Gerry A. Plaza Inquirer News Service
SAN JOSE, California--Ever think your computer might one day pester you with messages of love or take up arms in a fit of rage over your insensitivity?
A scene right out of an ’80s flick, "Electric Dreams," you might infer. Well, it’s not science fiction; this is starting to become a reality.
If researchers at IBM’s Almaden Research Center here are to be believed, we could soon see computers that actually know you hate them or, in turn, appreciate them for a job well done.
Their initiative to make this happen: the Blue Eyes research project currently being implemented by the center’s user systems ergonomic research group (User). Blue Eyes seeks attentive computation by integrating perceptual abilities into computers, wherein non-obtrusive sensing technologies, such as video cameras and microphones, are used to identify and observe your actions.
As you walk by the computer screen, for example, the camera would immediately "sense" your presence and automatically turn on room lights, the television, or radio while popping up your favorite Internet website on the display.
Part of this project is not only teaching computers how to sense or perceive user actions. They are also being programmed to know how users feel--depressed, ecstatic, bored, amused, or anxious--and make a corresponding response. A computer can, on its own, play a funny Flash animation to entertain its "master" if it notices a sad look on his or her face.
Voice or sound capabilities can also be integrated, with the computer "talking" to its user about the task at hand or simply acknowledging a command with a respectful "yes, sir."
In these cases, the computer extracts key information, such as where the user is looking, what he or she is saying or gesturing, or how the user’s emotions show in his or her grip on the pointing device.
These cues are analyzed to determine the user’s physical, emotional, or informational state, which can be used to increase productivity. This is done by performing expected actions or by providing expected information.
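The pipeline the article describes (sense cues, infer the user's state, then perform the expected action) can be pictured with a short sketch. The cue names, thresholds, and responses below are hypothetical placeholders, not the actual Blue Eyes logic.

# Minimal sketch of the sense -> infer state -> respond loop described above.
from dataclasses import dataclass

@dataclass
class Cues:
    gaze_target: str      # what the user is looking at on screen
    speech: str           # what the user is saying
    grip_pressure: float  # pressure on the pointing device, 0..1

def infer_state(cues: Cues) -> str:
    """Very rough rule-based stand-in for the real analysis."""
    if cues.grip_pressure > 0.8:
        return "anxious"
    if "help" in cues.speech.lower():
        return "confused"
    return "focused"

def respond(cues: Cues) -> str:
    """Map the inferred state to an expected action."""
    state = infer_state(cues)
    if state == "anxious":
        return "slow down and simplify the interface"
    if state == "confused":
        return "show help for " + cues.gaze_target
    return "open details for " + cues.gaze_target

print(respond(Cues("corporate information", "hmm, interesting", 0.3)))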
Your gaze
A prototype shown to the Inquirer by Blue Eyes researchers at Almaden reveals a nifty computer interface controlled by, well, a simple gaze. By tracking where the person is looking, the computer can carry out the operations it knows the user wants done.
Called the "simple user Internet tracker" or Suitor, the computer tracks eye movement in the display while on the background takes note of applications the user normally utilizes and the websites he or she constantly visits. By having this information, the computer then "serves" its user without any manual operation.
A user may hit www.inquirer.net and gaze at an interesting headline. The computer notices this and pops up the headline’s full story in an instant, without any mouse clicks. Another example is visiting a corporate site such as www.ibm.com. When the user sets his sights on "corporate information," a stock ticker suddenly appears showing IBM’s current share price. The ticker also presents related news stories pertaining to IBM’s business.
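A gaze-driven page like the one just described boils down to a dwell-time rule: if the eyes rest on a headline long enough, the full story is fetched without a click. The following sketch assumes a hypothetical gaze-tracking callback and an arbitrary dwell threshold; it is not the actual Suitor code.

# Minimal sketch of dwell-based selection: gaze resting on a headline for
# DWELL_SECONDS triggers expansion of the full story.
import time

DWELL_SECONDS = 1.5  # assumed threshold

def fetch_full_story(headline):
    # Placeholder for retrieving and displaying the article body.
    return "Expanding full story for: " + headline

def on_gaze_sample(headline_under_gaze, gaze_log, now=None):
    """Record when the gaze first landed on each headline and fire on dwell."""
    now = time.time() if now is None else now
    started = gaze_log.setdefault(headline_under_gaze, now)
    if now - started >= DWELL_SECONDS:
        return fetch_full_story(headline_under_gaze)
    return None

gaze_log = {}
print(on_gaze_sample("IBM shares rise", gaze_log, now=0.0))  # None, gaze just arrived
print(on_gaze_sample("IBM shares rise", gaze_log, now=2.0))  # story pops up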
Pong and Magic
Almaden researchers also presented a robot called "Pong" that observes retinal activity, recognizes user presence, and responds accordingly with, say, a human-like smile or a computer-based action like initializing a desktop PC.
This "attentive" robot created by the User group dramatizes computers’ emerging awareness by sensing people’s presence and reacting to their comings and goings.
"Pong" utilizes a small video camera to provide attention to its user. This camera is part of another prototype named Magic pointing, which allows the eyes to affect the movement of the cursor. This functionality can only allow the cursor to arrive at the point where the user gazes by having him or her touch the pointing device first. If the cursor were to move where the eyes go, the operation would be unproductive and impractical, the researchers say, because this leaves the user exhausted and confused.
Emotion mouse
Not only is User developing prototypes to recognize eye movement, but it is also looking at an intelligent pointing device that directly captures relevant data on the user’s emotional patterns. The computer can therefore either slow down, when it determines the user is anxious or tired, or speed up, when it identifies that the user is in a hurry.
The pointing device is called the emotion mouse, another prototype being developed at Almaden that can track a user’s mood or emotional state and make the computer respond accordingly.
The device can measure heart rate, temperature, galvanic skin response and minute bodily movements and matches them with six emotional states: happiness, surprise, anger, fear, sadness and disgust.
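One simple way to picture how such readings could be matched with the six states is a nearest-profile comparison: each new reading is compared against a stored reference profile per emotion, and the closest profile wins. The profiles and values below are invented for illustration and are not the actual emotion mouse implementation.

# Minimal nearest-profile sketch: match a sensor reading to the closest of
# six hypothetical emotion profiles (heart_rate, skin_temp, GSR, movement).
import math

PROFILES = {
    "happiness": (75, 34.0, 1.5, 0.30),
    "surprise":  (88, 33.7, 2.0, 0.50),
    "anger":     (92, 33.5, 2.1, 0.40),
    "fear":      (98, 32.8, 2.4, 0.55),
    "sadness":   (70, 33.0, 1.2, 0.10),
    "disgust":   (75, 33.2, 1.8, 0.20),
}

def classify(reading):
    """Return the emotion whose profile is nearest to the reading.
    A real system would normalize each measure before comparing."""
    return min(PROFILES, key=lambda emotion: math.dist(PROFILES[emotion], reading))

print(classify((96, 32.9, 2.3, 0.53)))  # -> 'fear'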
The mouse includes a set of sensors, including infrared detectors and temperature-sensitive chips. These components, User researchers stress, will also be crafted into other commonly used items such as the office chair, the steering wheel, the keyboard and the phone handle. Integrating the system into the steering wheel,