Human-Inspired Robotic Grasp Control With Tactile Sensing


Abstract

We present a novel robotic grasp controller that allows
a sensorized parallel-jaw gripper to gently pick up and set
down unknown objects once a grasp location has been selected.
Our approach is inspired by the control scheme that humans employ
for such actions, which is known to centrally depend on tactile
sensation rather than vision or proprioception.Our controller processes
measurements from the gripper’s fingertip pressure arrays
and hand-mounted accelerometer in real time to generate robotic
tactile signals that are designed to mimic human SA-I, FA-I, and
FA-II channels. These signals are combined into tactile event cues
that drive the transitions between six discrete states in the grasp
controller: Close, Load, Lift and Hold, Replace, Unload, and Open.
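To make the controller structure concrete, the following is a minimal sketch of a tactile-event-driven state machine over the six states named above; the event names and transition conditions are illustrative assumptions, not the authors' implementation.

```python
from enum import Enum, auto

class GraspState(Enum):
    """The six discrete grasp-controller states from the abstract."""
    CLOSE = auto()
    LOAD = auto()
    LIFT_AND_HOLD = auto()
    REPLACE = auto()
    UNLOAD = auto()
    OPEN = auto()

# Hypothetical tactile event cues (names are illustrative only); in the
# paper these cues are derived from the SA-I-, FA-I-, and FA-II-like signals.
TRANSITIONS = {
    (GraspState.CLOSE, "contact_detected"): GraspState.LOAD,
    (GraspState.LOAD, "grip_force_reached"): GraspState.LIFT_AND_HOLD,
    (GraspState.LIFT_AND_HOLD, "place_commanded"): GraspState.REPLACE,
    (GraspState.REPLACE, "table_contact_detected"): GraspState.UNLOAD,
    (GraspState.UNLOAD, "fingertip_force_released"): GraspState.OPEN,
}

def step(state: GraspState, event: str) -> GraspState:
    # Advance on a recognized tactile event; otherwise stay in the current state.
    return TRANSITIONS.get((state, event), state)
```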

INTRODUCTION


As robots move into human environments, they will
need to know how to grasp and manipulate a very wide
variety of objects [1]. For example, some items may be soft
and light, such as a stuffed animal or an empty cardboard box,
while others may be hard and dense, such as a glass bottle or
apple. After deciding where such objects should be grasped
(finger placement), the robot must also have a concept of how to
execute the grasp (finger forces and reactions to changes in the
grasp state). A robot that operates in the real world must be able
to quickly grip a wide variety of objects firmly, without dropping
them, and delicately, without crushing them (see Fig. 1).

Human Grasping

Neuroscientists have thoroughly studied the human ability
to grasp and manipulate objects. As recently reviewed by
Johansson and Flanagan [4], human manipulation makes great
use of tactile signals from several different types of mechanoreceptors
in the glabrous (nonhairy) skin of the hand, with vision
and proprioception providing information that is less essential.

BACKGROUND

The development of tactile sensors for robotic hands has been
a very active area of research, as reviewed by Cutkosky et al.
[9]. Among the wide range of sensors one could use, Dahiya
et al. [10] present a strong case for the importance of having
tactile sensors capable of reproducing the rich response of the
human tactile sensing system. Along these lines, some recent
sensors have even achieved dynamic response [11] and spatial
coverage [12] that are comparable to the human fingertip. Using
tactile sensory cues with wide dynamic range as a trigger for
robotic actions was first proposed by Howe et al. over two
decades ago [13]. Unfortunately, most such sensors still exist
only as research prototypes and are not widely available. The
pressure-sensing arrays that are used in our study represent the
state of the art in commercially available tactile sensors, and they
are available on all PR2 robots. High-bandwidth acceleration
sensing is far more established, although such sensors are rarely
included in robotic grippers.


ROBOT EXPERIMENTAL SYSTEM

Robots have great potential to perform useful work in everyday
settings, such as cleaning up a messy room, preparing
and delivering orders at a restaurant, or setting up equipment
for an outdoor event [1]. Executing such complex tasks requires
hardware that is both capable and robust. Consequently, we use
the Willow Garage PR2 robotic platform. As shown in Fig. 1,
the PR2 is a human-sized robot that is designed for both navigation
and manipulation. It has an omnidirectional wheeled
base, two seven-degree-of-freedom (DOF) arms, and two one-
DOF parallel-jaw grippers. Its extensive noncontact sensor suite
includes two stereo camera pairs, an LED pattern projector,
a high-resolution camera, a camera on each forearm, a head-mounted
tilting laser range finder, a body-mounted fixed laser
range finder, and an inertial measurement unit (IMU).

LOW-LEVEL SIGNALS AND CONTROL

Individual sensor readings and actuator commands are far
removed from the task of delicately picking up an object and setting
it back down on a table. Consequently, the high-level grasp
controller that is diagrammed in Fig. 2 rests on an essential
low-level processing layer that encompasses both sensing and
acting. Here, we describe the three tactile sensory signals that
we designed to mimic human SA-I, FA-I, and FA-II afferents,
along with the position and force controllers that are needed for
the gripper to move smoothly and interact gently with objects.
All signal filtering and feedback control was done within the
1-kHz soft-real-time loop.
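As a rough illustration of this processing layer, the sketch below derives three such channel-like signals from a 1-kHz pressure-sum stream and an accelerometer magnitude; the filter types and cutoff frequencies are assumptions chosen for readability, not the paper's exact design.

```python
from scipy import signal

FS = 1000.0  # Hz, matching the 1-kHz soft-real-time loop

def design_filters():
    # SA-I analog: slowly adapting, tracks sustained pressure -> low-pass (assumed 5 Hz).
    sa1 = signal.butter(1, 5.0, btype="low", fs=FS, output="sos")
    # FA-I analog: fast adapting, responds to pressure changes -> band-pass (assumed 5-50 Hz).
    fa1 = signal.butter(1, [5.0, 50.0], btype="band", fs=FS, output="sos")
    # FA-II analog: vibration sensitive, fed by the hand-mounted accelerometer -> high-pass (assumed 50 Hz).
    fa2 = signal.butter(1, 50.0, btype="high", fs=FS, output="sos")
    return sa1, fa1, fa2

def tactile_channels(pressure_sum, accel_magnitude):
    """Return SA-I-, FA-I-, and FA-II-like signals from raw 1-D sample streams."""
    sa1, fa1, fa2 = design_filters()
    sa1_signal = signal.sosfilt(sa1, pressure_sum)      # sustained grip pressure
    fa1_signal = signal.sosfilt(fa1, pressure_sum)      # pressure transients (contact, slip)
    fa2_signal = signal.sosfilt(fa2, accel_magnitude)   # high-frequency vibration events
    return sa1_signal, fa1_signal, fa2_signal
```

In the full controller, these filtered signals would then be thresholded into the discrete tactile event cues that drive the grasp state machine.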


CONCLUSION

This paper has introduced a set of sensory signals and control
approaches that attempt to mimic the human reliance on
cutaneous sensations during grasping tasks. We have presented
a framework that highlights how tactile feedback can be used as
a primary means to complete a manipulation task, with encouraging
results. While it is clear to the authors that not all tasks
can be completed solely via tactile information, we feel that it
is a promising and underutilized tool in robotic manipulation
systems.
In future work, we hope to add additional sensing modalities
into our object handling framework, including estimates
of the necessary object grip force from visual and laser recognition,
audio feedback about crushing and damaging objects,
weight sensing after an object has been lifted off the table, grasp
disturbance prediction from arm motion data, grasp quality information
that is based on the fingerpad contacts, and combining
these data into higher-level percept information. Our
current estimate of the initial grip force necessary to lift an object
depends solely on the hardness information gleaned during
contact. While it has proven to be a strong indicator for many everyday
objects, it does have certain failure cases where hardness
information is deceptive, such as soft but heavy objects (e.g., a
heavy trash bag or a stuffed animal) and light but hard objects
(e.g., an egg or a thin wine glass). Supplementing this information
with additional object data will likely lead to a superior grip
force estimator.
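For illustration only, here is a hedged sketch of how a hardness estimate gleaned at contact might map to an initial grip force; the linear form and every numeric value are hypothetical, not the authors' estimator.

```python
def initial_grip_force(hardness, f_min=2.0, f_max=20.0):
    """Map a normalized hardness estimate (0.0 = very soft, 1.0 = very hard)
    to an initial grip force in newtons.

    Hypothetical linear mapping: harder objects are assumed to tolerate, and
    typically require, a firmer initial grip. All values are illustrative.
    """
    h = min(max(hardness, 0.0), 1.0)  # clamp to [0, 1]
    return f_min + h * (f_max - f_min)
```

A mapping of this kind fails in exactly the cases noted above: a heavy trash bag reads as soft and would be gripped too lightly, while an egg reads as hard and would be gripped too firmly.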
Our signal to detect slip information was successful, but
the force response to slip events could be improved by using
a gripper with superior dynamic response. The PR2 gripper
is admittedly inferior to several of the compliant and well-modeled
designs existing in the literature; we hypothesize that
these methods would be even more successful if implemented
with these alternative research systems. Furthermore, Takahashi
et al. [19] have shown that it is possible to obtain useful centroid-of-contact
information with spherical fingertips. This is a feature
we were unable to reproduce with the PR2’s flat fingertips, but
we may attempt to redesign the fingertip shape in the future if
it proves highly beneficial. The addition of shear-force sensing
capabilities to the fingertip may also prove an important indicator
of slip information, and it is a close parallel to an important
mechanoreceptor of the hand we do not currently mimic, i.e.,
the SA-II channel, which detects skin stretch.
The implementation presented in this paper deals with the
case of a two-fingered parallel-jaw gripper. The grasp strategies
with such a hand are limited, and this is one aspect of why
we have focused so narrowly on simple pick-and-place tasks.
As more advanced multifingered robot hands come into use,
we see these same contact signals that we have described being
extremely useful in less task-specific ways. Research is currently
underway to develop a tactile-event-driven state machine for
object interaction, which should allow for rich feedback during
a wider variety of interesting multifingered interactions and arm
motions. The work presented in this paper is actively used in
multiple PR2 robots around the world. As we continue to refine
this system and increase the range of objects it can handle,
all relevant code is freely available at https://code.ros, and
we encourage interested researchers to contact us about porting
these capabilities to additional robot platforms.