27-11-2012, 05:56 PM
BLUE EYES
BLUE EYES.doc (Size: 1.45 MB / Downloads: 45)
INTRODUCTION
The BLUE EYES project was started at IBM's Almaden Research Center in the USA. It aims at giving computers highly developed abilities to perceive, integrate and interpret visual, auditory and touch information. The project explores various ways of allowing people to operate computers without conscious effort. Gaze tracking seemed like the natural place to start, because the eyes are the most expressive part of a human being and emotions are readily reflected in them.
These days it is humans who have to adapt to computers by learning various languages, modes of operation etc. The BLUE EYES project aims at creating computers which can adapt to humans and thus let people operate them much more conveniently. The project enables computers and humans to work together more as partners.
The Blue Eyes project makes effective use of existing biometric techniques for emotion detection. Different biometric sensors monitor different parts of the human body, and by properly processing their outputs the emotion of a person is identified. Blue Eyes uses non-obtrusive sensing technologies, such as video cameras and microphones, to identify the user's actions and to extract key information. These cues are analyzed to determine the user's physical, emotional or informational state.
BIOMETRICS
Computers must be given the power to sense the emotions of their users in order to make them 'attentive computers'. Biometrics is the science used for this implementation: the measurement of the physiological and behavioral characteristics of a person. By using these characteristics, the emotional state of the user is identified.
The eyes are the most expressive part of a human being, so iris scanning is one of the most important techniques used. A person's emotions are directly reflected in physiological attributes, so measures of heartbeat, blood pressure etc. give direct information about the emotional state of the user. Several techniques, such as eye gaze tracking, facial expression detection, speech recognition, and detection using the Emotion Mouse or the Jazz multisensor, are used for emotion detection.
Besides emotion detection, biometric techniques can be used for security purposes. Physical characteristics like fingerprints, hand geometry, the retina and the voice are unique to every human being, so by analyzing these characteristics a person can be easily identified. Several methods, such as fingerprint identification, retinal scanning and voice identification, have been successfully implemented for security. In the implementation of Blue Eyes, however, the biometric techniques are used mainly for emotion detection.
Face Recognition
Face recognition is applied in a variety of domains, predominantly for security. The user's face must be identified before further processing, so the first problem to solve is finding the face in the image. The first stage of the process is color segmentation, which simply determines whether the proportion of skin-tone pixels in a region is greater than some threshold. Candidate regions are then given scores. Next, instead of searching for all the facial features directly in the face image, a few 'high-level' features (eyes, nose, mouth) are located first.
Then 26 other 'low-level' features that may be parts of the eyes, nose, mouth, eyebrows etc. are located relative to the high-level feature locations. The approximate locations of the high-level features are known from mean and variance statistics relative to the nose position, gathered on the training database.
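The first-stage color segmentation described above can be sketched as follows. The RGB thresholds and the 30% skin-tone proportion are illustrative assumptions, not values from the original system:

```python
def is_skin_tone(r, g, b):
    """Very rough RGB skin-tone test (illustrative thresholds only)."""
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def skin_proportion(pixels):
    """Fraction of pixels classified as skin tone."""
    return sum(1 for p in pixels if is_skin_tone(*p)) / len(pixels)

def contains_face_candidate(pixels, threshold=0.3):
    """First-stage test: a region passes if its skin-tone proportion
    exceeds the threshold; only then is it scored as a face candidate."""
    return skin_proportion(pixels) > threshold
```

Regions that pass this cheap test would then go on to the scoring and high-level feature search.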
Expression Detection
Facial expression determination is a field of rapid ongoing research. One of the most intriguing inventions is the Expression Glasses, a wearable mobile device. It is comfortable to wear, and surveys among test users have produced good results. The device measures the movement of the facial muscles, and this movement is then compared against a reference index to determine emotion.
This device determines the user's level of interest or confusion by measuring the movement of the muscles around the eyes. The output from the Expression Glasses is fed to the computer for further processing, where it is converted into a two-colored bar graph: red bars indicate confusion and green bars indicate interest. This graph gives a clear indication of the level of interest or confusion.
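The mapping from muscle-movement readings to the red/green bar graph can be sketched like this; the 0.5 confusion threshold and the unit-scaled samples are hypothetical:

```python
def interest_report(muscle_samples, confusion_threshold=0.5):
    """Classify each muscle-movement sample as a red (confusion) or
    green (interest) bar and report the overall confusion proportion."""
    bars = ["red" if s > confusion_threshold else "green" for s in muscle_samples]
    return bars, bars.count("red") / len(bars)
```

A run over four samples, two above and two below the threshold, would yield bars ["green", "red", "green", "red"] and a confusion proportion of 0.5.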
Speech Recognition
Speech recognition is the process of converting a speech signal to a set of words by means of an algorithm implemented as a computer program. Voice or speaker identification is a related process that attempts to identify the person speaking, as opposed to what is being said.
Speech is processed by means of complex voice-processing algorithms. First the speech signal is converted into a set of words by proper sampling, quantization and coding; these are called voice prints. A reference index already contains different voice prints corresponding to each emotion. By comparing the subject user's voice print with these references, the emotion is identified. It is mainly the tone of the voice that is compared, rather than what is being said.
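The comparison against a reference index of voice prints can be sketched as a nearest-neighbor match. The feature names (pitch mean, pitch variance, energy) and all values in `REFERENCE_PRINTS` are illustrative assumptions:

```python
import math

# Hypothetical reference index: emotion -> averaged voice-print feature
# vector (pitch mean in Hz, pitch variance, normalized energy).
REFERENCE_PRINTS = {
    "calm":  (120.0, 10.0, 0.4),
    "angry": (180.0, 40.0, 0.9),
    "sad":   (100.0,  8.0, 0.3),
}

def classify_emotion(voice_print):
    """Return the emotion whose reference voice print is nearest
    to the subject's print in Euclidean distance."""
    return min(REFERENCE_PRINTS,
               key=lambda e: math.dist(REFERENCE_PRINTS[e], voice_print))
```

Since tone dominates the comparison, pitch-related features carry most of the distance here.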
Emotion Mouse
A non-invasive way to obtain information about the user's emotional state is through touch. People use their computer to store and manipulate data. The proposed method for obtaining user information through touch is via a computer input device, the mouse. The computer determines the user's emotional state from a simple touch: sensors in the mouse sense physiological attributes, which are correlated to emotions using a correlation model.
The Emotion Mouse contains a number of sensors, each sensing an individual attribute: an IR sensor, a thermistor chip and a galvanic skin-response sensor. The IR sensor measures the heartbeat from the fingertip, the thermistor chip measures the body temperature, and the galvanic sensor measures skin conductivity. All these attributes are combined into a vector that represents the emotional state of the user.
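Before the three readings can be combined into one vector, they must be made comparable. A minimal sketch, assuming hypothetical normal operating ranges for each attribute:

```python
# Hypothetical operating ranges for each sensor attribute (illustrative).
RANGES = {
    "heart_rate":  (50.0, 120.0),   # beats per minute, from the IR sensor
    "temp_c":      (32.0, 37.0),    # skin temperature, from the thermistor
    "conductance": (1.0, 10.0),     # microsiemens, from the galvanic sensor
}

def normalize(value, lo, hi):
    """Scale a raw reading into [0, 1] so attributes are comparable."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def emotion_vector(heart_rate, temp_c, conductance):
    """Combine the three Emotion Mouse readings into one feature vector."""
    return (normalize(heart_rate, *RANGES["heart_rate"]),
            normalize(temp_c, *RANGES["temp_c"]),
            normalize(conductance, *RANGES["conductance"]))
```

The resulting vector would then be fed to the correlation model that maps it to an emotional state.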
Jazz Multi Sensor
The Jazz multisensor is a single device that senses multiple attributes. It is a mobile device that the user wears on the forehead, with several sensors incorporated into it, and it draws on all the emotion-detection techniques described above.
This single device is capable of detecting a subject user's emotional state. It was developed at the research laboratory of Poznan University, Poland. The sensors in a Jazz multisensor include an IR sensor, an oculographic transducer, an ambient-illumination sensor, expression glasses and a microphone. The different sensors sense different physiological attributes, which together give a complex output.
The plethysmographic signals, which are the signals from the cardiac, circulatory and pulmonary systems, give a direct indication of the emotional state. The sensors sensing these signals are collectively known as plethysmographic transducers.
The IR sensor senses the heartbeat and the level of blood oxygenation. The pulse rate is calculated in the analysis section from the levels of oxyhaemoglobin and deoxyhaemoglobin sensed by the Jazz multisensor.
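A crude version of the pulse-rate calculation can be sketched as peak counting on the IR (photoplethysmographic) signal; the peak criterion and the sample data are illustrative only:

```python
def count_peaks(samples):
    """Count local maxima above the signal mean, a crude stand-in
    for detecting systolic peaks in the IR absorption signal."""
    mean = sum(samples) / len(samples)
    peaks = 0
    for i in range(1, len(samples) - 1):
        if samples[i] > mean and samples[i] > samples[i - 1] and samples[i] >= samples[i + 1]:
            peaks += 1
    return peaks

def pulse_rate_bpm(samples, duration_s):
    """Beats per minute from the peak count over the sampling window."""
    return count_peaks(samples) * 60.0 / duration_s
```

Three peaks captured over a three-second window correspond to 60 beats per minute.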
BLUE EYES HARDWARE
The Blue Eyes system provides the technical means for monitoring and recording the operator's physiological parameters. The Blue Eyes hardware consists of two main parts: the Data Acquisition Unit and the Central System Unit.
System Overview
The Data Acquisition Unit is a mobile measuring device consisting of a number of modules, such as the physiological parameter sensor, voice interface and ID card interface. An ID card assigned to each operator, together with adequate user profiles on the Central System Unit, provides the necessary data personalization, so different people can use a single mobile device. The mobile device is integrated with a Bluetooth module providing a wireless interface between the sensors worn by the operator and the central unit.
The Central System Unit provides real-time buffering of the incoming sensor signals and semi-real-time processing of the data. It consists of different data modules for its proper functioning. The overall system diagram is shown below; the individual components are explained as follows.
Data Acquisition Unit
The Data Acquisition Unit is the mobile part of the Blue Eyes hardware. Its main tasks are to maintain Bluetooth connections, to receive information from the sensor and send it over the wireless connection, to deliver the alarm messages sent from the Central System Unit to the operator, and to handle personalized ID cards.
The components which constitute the Data Acquisition Unit are an Atmel 89C52 microcontroller, a PCM codec, a personal ID card interface, a Jazz multisensor interface, a beeper, an LCD display, LED indicators, a simple keyboard and, finally, a Bluetooth module. The arrangement of these blocks to realize a DAU is pictured below.
Personal ID Card Interface
ID cards are assigned to each of the operators, and corresponding user profiles on the Central System Unit provide the necessary data personalization, so different people can use the same mobile DAU. To start the DAU, the operator should insert the personal ID card. After inserting the ID card into the mobile device and entering the proper PIN code, the device starts listening for incoming Bluetooth connections. Once the connection has been established and the authorization process has succeeded (the PIN code is correct), the central system starts monitoring the output of the DAU.
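The startup sequence above (card inserted, PIN checked, Bluetooth listening, monitoring) can be sketched as a small state machine. The class, state names and PIN table are hypothetical:

```python
class DAU:
    """Minimal sketch of the DAU's ID-card / PIN startup sequence."""

    def __init__(self, valid_pins):
        self.valid_pins = valid_pins   # card_id -> PIN, from operator profiles
        self.card_id = None
        self.state = "idle"

    def insert_card(self, card_id):
        self.card_id = card_id
        self.state = "awaiting_pin"

    def enter_pin(self, pin):
        if self.state == "awaiting_pin" and self.valid_pins.get(self.card_id) == pin:
            self.state = "listening"   # now accepts Bluetooth connections
        else:
            self.state = "idle"        # wrong PIN: back to the start
        return self.state

    def connect(self):
        if self.state == "listening":
            self.state = "monitored"   # central system starts monitoring
        return self.state
```

Because the PIN table comes from the central profiles, any operator's card works in any DAU.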
Operator Manager
The data of each supervised operator is buffered separately in a dedicated Operator Manager. At startup it communicates with the Operator Data Manager in order to get more detailed personal data. The Operator Manager's most important task is to buffer the incoming raw data and to split it into separate data streams related to each of the measured parameters.
The raw data is sent to a Logger Module, while the split data streams are made available to the other system modules through producer-consumer queues. Furthermore, the Operator Manager provides an interface for sending alert messages to the related operator.
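The splitting of raw data into per-parameter producer-consumer queues can be sketched with Python's thread-safe `queue.Queue`; the parameter names are assumptions:

```python
import queue

# One producer-consumer queue per measured parameter (hypothetical names).
PARAMETERS = ("pulse", "saturation", "eye_movement")

queues = {p: queue.Queue() for p in PARAMETERS}

def split_raw_data(raw_samples, queues):
    """Demultiplex tagged raw samples into per-parameter queues.
    Consumers (analyzers, logger) read from the queue they registered for."""
    for param, value in raw_samples:
        queues[param].put(value)
```

Each analyzer then blocks on `queues[param].get()` for just the stream it consumes, while the Operator Manager keeps producing.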
Operator Data Manager provides an interface to the operator database enabling the other modules to read or write personal data and system access information.
Data Logger Module
The module provides support for storing the monitored data in order to enable the supervisor to reconstruct and analyze the course of the operator's duty. The module registers as a consumer of the data to be stored in the database. Each working operator's data is recorded by a separate instance of the Data Logger. Apart from the raw or processed physiological data, alerts and operator's voice are stored. The raw data is supplied by the related Operator Manager module, whereas the Data Analysis module delivers the processed data.
The voice data is delivered by a Voice Data Acquisition module. The module registers as a consumer of the operator's voice data and optionally processes the sound to be stored (e.g. reducing noise or removing fragments where the operator does not speak). The Logger's task is to add appropriate time stamps to enable the system to reconstruct the voice.
Data Analysis Module
The module performs the analysis of the raw sensor data in order to obtain information about the operator's physiological condition. A separately running Data Analysis Module supervises each working operator. The module consists of a number of smaller analyzers extracting different types of information. Each analyzer registers at the appropriate Operator Manager or at another analyzer as a data consumer and, acting as a producer, provides the results of its analysis.
Alarm Dispatcher Module
Alarm Dispatcher Module is a very important part of the Data Analysis module. It registers for the results of the data analysis, checks them with regard to the user-defined alarm conditions and launches appropriate actions when needed. The module is a producer of the alarm messages, so that they are accessible in the logger and visualization modules.
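The alarm-condition check can be sketched as a range test on the analysis results; the parameter names and thresholds in `ALARM_CONDITIONS` are illustrative user-defined values:

```python
# User-defined alarm conditions: parameter -> (low, high) acceptable range.
# Names and thresholds are illustrative, not taken from the original system.
ALARM_CONDITIONS = {
    "pulse": (50, 110),          # beats per minute
    "saturation": (0.90, 1.00),  # blood oxygen saturation fraction
}

def check_alarms(results):
    """Return alarm messages for any analysis result outside its allowed range;
    these messages are produced for the logger and visualization modules."""
    alarms = []
    for param, value in results.items():
        lo, hi = ALARM_CONDITIONS.get(param, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alarms.append(f"ALARM: {param}={value} outside [{lo}, {hi}]")
    return alarms
```

A result set within all ranges yields no alarms; any out-of-range value produces one message per offending parameter.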
Visualization Module
The module provides the user interface for the supervisors. It enables them to watch each working operator's physiological condition along with a preview of the selected video source and the related sound stream. All incoming alarm messages are instantly signaled to the supervisor. Moreover, the visualization module can be set to an off-line mode, in which all the data is fetched from the database. By reviewing the recorded physiological parameters, alarms, video and audio data, the supervisor is able to reconstruct the course of the selected operator's duty.