Haptics: The Technology of Touch
Salisbury_Haptics95.pdf (Size: 12.61 KB / Downloads: 82)
Introduction
Haptics -- the newest technology to arrive in the world of computer interface devices -- promises to bring
profound changes to the way humans interact with information and communicate ideas. Recent advances in
computer interface technology now permit us to touch and manipulate imaginary computer-generated objects
in a way that evokes a compelling sense of tactile "realness."
With this technology we can now sit down at a computer terminal and touch objects that exist only in the
"mind" of the computer. These interactions might be as simple as touching a virtual wall or button, or as
complex as performing a critical procedure in a surgical simulator.
This paper outlines some of the history that has led to the current state-of-the-art in haptics, the emergence of
a new haptic interface device, the application potential it offers, and the enabling technologies that will permit
pervasive use of haptics.
Sensing and Manipulation: A Co-Dependence
The term "haptics" has been used for years by researchers in human psychophysics who study how people
use their hands to sense and manipulate objects. Unique among our sensory modalities, haptics relies on
action to stimulate perception. To sense the shape of a cup we do not take a simple tactile snapshot and go
away to think about what we felt. Rather, we grasp and manipulate the object, running our fingers across its
shape and surfaces in order to build a mental image of a cup. This co-dependence between sensing and
manipulation is at the heart of understanding how humans can so deftly interact with the physical world.
Recently the term "haptic interfaces" has begun to be used by human interface technologists to describe
devices that measure the motions of, and stimulate the sensory capabilities within, our hands. There is a long
and respectable history in the development of devices to permit humans to control remotely located robots
(tele-manipulators). Yet, it has taken the explosion of computer capability and the yearning for better ways to
connect to newly complex computer-generated worlds to drive the creation and development of practical
devices for haptic interaction.
A Feel for Remote Control
For as long as people have remotely controlled machines, they have built devices to convey a sense of
feel for the remote action. Remote manipulation devices existed even before the need to safely handle
hazardous radioactive materials. Early on these took the form of simple
lever- and cable-actuated tongs on the end of a pole. These evolved into mechanical contrivances with
elbows, wrists and crude hands. So-called "hot-cell manipulators" enabled workers to grasp a flask and pour a
dangerous liquid. The worker could move, orient and squeeze a simple pistol grip to control the remotely
located "tongs" to perform the work. Mechanical links and cables communicated motions and forces between
the humans and a remote hand.
Early researchers quickly recognized the need to transmit these motions and forces with as much fidelity and
speed (bandwidth) as possible. They struggled to find ways to eliminate friction and sloppiness in the
mechanical actions. From the works of pioneers such as Mosher, Goertz, Vertut, Flatau and many others
came designs that permitted remote manipulation of objects with a high degree of dexterity; many of these
designs are still in use in the nuclear and hazardous material industries.
As the need for more distant remote manipulation arose, researchers developed designs that eliminated the
direct mechanical connection between the master and remote devices. Using motors and simple electronic
sensors, it became possible to connect human hand actions to a remote manipulator via electronic signals.
Within these devices, motors provided the force both to perform the task and to provide the user with the feel
of doing the task.
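The electrically linked arrangement described above can be sketched as a simple bilateral "position-position" control loop, in which each side's motor is driven toward the other side's measured position. When the remote tong meets an obstacle, the coupling spring loads up and the operator's handle feels the resistance. All gains, masses, and the obstacle below are illustrative assumptions, not values from the paper.

```python
# Bilateral position-position teleoperation, one axis, explicit Euler.
KP = 50.0    # master-slave coupling stiffness (illustrative)
B = 2.0      # viscous damping on each handle
DT = 0.001   # 1 kHz servo tick, seconds
MASS = 0.1   # lumped mass of each handle, kg
WALL = 0.5   # position of a rigid obstacle in the remote environment

def simulate(operator_force=1.0, steps=5000):
    """Push the master with a constant force; return final positions."""
    mx = mv = sx = sv = 0.0  # master/slave position and velocity
    for _ in range(steps):
        # Each motor pulls its side toward the other side's position.
        f_master = KP * (sx - mx) - B * mv + operator_force
        f_slave = KP * (mx - sx) - B * sv
        mv += (f_master / MASS) * DT
        sv += (f_slave / MASS) * DT
        mx += mv * DT
        sx += sv * DT
        if sx > WALL:        # remote tong contacts the obstacle
            sx, sv = WALL, 0.0
    return mx, sx

mx, sx = simulate()
# The slave stops at the wall; the master comes to rest just past it,
# held back by the loaded coupling spring that the operator feels.
```

The key property of the scheme is symmetry: the same spring term that drives the slave to follow the hand also pushes back on the hand when the slave cannot follow, so force feedback comes for free from the coupling.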
Toward Computer-Generated Reality
At some point, it was realized that if correct electrical signals could be generated by a computer, the master
device could be used to make users feel as though they were performing a real task. In reality, users would
simply be interacting through motors with a computer program. Early experiments were conducted by Knoll at
AT&T, Kilpatrick and Brooks at The University of North Carolina at Chapel Hill, and Wilson at The University
of California - San Diego. They demonstrated that the sense of touching simple shapes could indeed be
evoked by programming computers to control electromechanical master devices.
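The shift those experiments made, replacing a remote environment with a computer program, can be illustrated by the classic first haptic demonstration: a virtual wall rendered as a stiff spring. The sketch below is a generic illustration of that idea; the names and the stiffness value are assumptions, not details from the paper.

```python
# Rendering a virtual wall: each servo tick, read the device position;
# if it has penetrated the wall, command a restoring spring force
# through the motors; otherwise command zero force, so free space
# feels free. Names and the stiffness value are illustrative.

WALL_X = 0.0        # wall surface along one axis (metres); wall fills x < 0
STIFFNESS = 800.0   # virtual spring constant (N/m), illustrative

def wall_force(position_x):
    """Force the motors should exert on the user's hand this tick."""
    penetration = WALL_X - position_x
    if penetration > 0.0:               # hand is inside the wall
        return STIFFNESS * penetration  # spring pushes the hand back out
    return 0.0                          # free space: no force

# In a real device this runs in a high-rate servo loop, e.g.:
#   while servo_on: command(wall_force(read_position()))
```

A stiffer spring feels more like a solid surface, but in a physical device the servo rate and sensor resolution bound how stiff the wall can be made before the rendering becomes unstable and buzzes.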