
Humanoid Multimodal Tactile-Sensing Modules


Abstract

In this paper, we present a new generation of active tactile modules (i.e., HEX-O-SKIN), developed to approach multimodal whole-body touch sensation for humanoid robots. To perform more like humans, humanoid robots need a variety of sensory modalities to interact with their environment. This calls for a certain robustness and fault tolerance, as well as an intelligent solution for connecting the different sensory modalities to the robot. Each HEX-O-SKIN is a small hexagonal printed circuit board equipped with multiple discrete sensors for temperature, acceleration, and proximity. With these sensors, we emulate the human senses of temperature, vibration, and light touch. Off-the-shelf sensors were utilized to speed up our development cycle; in general, however, the design can easily be extended with new discrete sensors, making it flexible for further exploration. A local controller on each HEX-O-SKIN preprocesses the sensor signals and actively routes data through a network of modules toward the closest PC connection. Local preprocessing decreases the necessary network and high-level processing bandwidth, while local analog-to-digital conversion and digital data transfer are less sensitive to electromagnetic interference. With an active data-routing scheme, it is also possible to reroute data around broken connections, yielding robustness throughout the global structure while minimizing wiring. To support our approach, multiple HEX-O-SKIN modules are embedded into a rapid-prototyped elastomer skin material and redundantly connected to neighboring modules by just four ports each.

INTRODUCTION

Human Skin—Humanoid’s Archetype

Getting in contact with objects is, for humans, informative in many ways. When we touch the objects around us, willingly or accidentally, we estimate contact and object properties [1]. This helps us to classify objects and to learn more about our complex environment and how we can safely interact with it [2]. Together with muscular, joint, and internal body sensors, skin sensitivity makes up a large part of our proprioceptive system, assisting us in planning tasks and performing motion control [3]. To perform these tasks, human skin is equipped with a large number of different receptors located in different layers of the skin [4]. Approximately 5 million free nerve endings are embodied in different structures to transduce light and deep pressure, heat or cold, shear stress, vibration, and physical or chemical danger to the skin [5]. Interestingly, processing begins at the receptor itself, which adapts to constant excitation [6]. This local preprocessing is followed by reflex loops located in the spinal cord [5]. Finally, all tactile signals are fused in the brain with information from other sensory systems, such as vision and audition [7].

Related Work

Various approaches have been pursued by previous projects, which illustrates the complexity involved in providing humanoid robots with a sense of touch. Every project made slightly different compromises with regard to sensor density, modalities, development and production costs, robustness, usability, and many other criteria. For a recent survey, see [1]. Here, we highlight some aspects related to our work.

SYSTEM DESCRIPTION

Our Approach

Our approach combines advantages of various preceding projects (as outlined in the previous section). Here, we focus on the infrastructure needed to support large areas of multimodal humanoid touch sensation. To this end, an intelligent sensor module was developed, along with a field-programmable gate array (FPGA)-based interface card that links a network of tactile modules (i.e., HEX-O-SKIN) to a computer-based processor and its robot controller (as explained in Section II-B).

System Overview

Our system is separated into multiple hardware subsystems (see the overview in Fig. 2). Tactile sensation starts at the HEX-O-SKIN, a small hexagonal printed circuit board (PCB) with transducers for every sensor modality and a local controller to convert and preprocess the sensor signals. Every HEX-O-SKIN has four ports, each providing a power and a universal asynchronous receiver-transmitter (UART) connection that can be used to connect it to neighboring modules (for details, see Section II-C).
a) Skin patches: These are multiple HEX-O-SKIN modules embedded into a piece of elastomer. Within a skin patch, data packets are actively routed from one neighbor to the next by the local controllers. As the boundary of every skin patch exposes the ports of its outer tactile modules, skin patches can be connected directly to one another. To cover a segment of the robot, one can design a specific skin patch or use standard forms like our prototype skin patch (see Fig. 1).
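The neighbor-to-neighbor routing described above can be illustrated with a shortest-hop search over the module graph that skips failed links, which is what allows data to be rerouted around broken connections. The following is a minimal Python sketch under simplifying assumptions; the graph representation, the `route_to_pc` helper, and the example topology are illustrative and do not reflect the actual firmware of the local controllers.

```python
from collections import deque

def route_to_pc(links, source, pc, broken=frozenset()):
    """Breadth-first search for a shortest hop path from a tactile
    module to the PC connection, ignoring broken links.

    links:  dict mapping module id -> list of neighboring module ids
    broken: set of (a, b) pairs marking failed undirected links
    Returns the path as a list of module ids, or None if isolated."""
    def link_ok(a, b):
        return (a, b) not in broken and (b, a) not in broken

    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == pc:
            return path
        for nxt in links.get(node, ()):
            if nxt not in visited and link_ok(node, nxt):
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no route: this part of the patch is cut off

# Four redundantly connected modules; link A-B has failed.
links = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
         "D": ["B", "C", "PC"], "PC": ["D"]}
print(route_to_pc(links, "A", "PC", broken={("A", "B")}))  # → ['A', 'C', 'D', 'PC']
```

Because every module offers four redundant ports, a single broken link merely lengthens the route instead of disconnecting the module, which is the robustness property claimed for the global structure.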

Acceleration—Impact Reaction

Safety is very important when a robot interacts with people or the environment. Independent of the robot's force sensors (i.e., tactile or joint), we wanted to detect impacts with other objects. As a robot segment normally moves smoothly, we can discriminate unexpected impacts with objects by the absolute rate of change of the acceleration. To demonstrate this effect, we programmed the robot to move in the opposite direction whenever a constant magnitude threshold was exceeded on the accelerometer axis normal to the tactile module plane. As an impact influences the acceleration of the whole segment, and the excited vibrations are partially conducted through the frame, it was possible to use a single accelerometer to detect impacts at various segment locations and even across segments. Fig. 11 shows an example of impact signals. (Footage of the experiment is also given in the submitted video.)
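The detection rule above, thresholding the absolute rate of change of the acceleration on the axis normal to the module plane, can be sketched as a finite-difference check. This is a minimal illustration only; the sample values, sampling interval, and threshold are invented for the example and are not the parameters used in the experiment.

```python
def detect_impacts(accel_z, dt, threshold):
    """Return sample indices where the absolute acceleration change
    rate (finite-difference jerk) on the axis normal to the module
    plane exceeds a fixed threshold. Smooth motion produces a small
    change rate; a collision produces a sharp spike.

    accel_z:   acceleration samples on the normal axis (m/s^2)
    dt:        sampling interval (s)
    threshold: jerk magnitude threshold (m/s^3)"""
    impacts = []
    for i in range(1, len(accel_z)):
        jerk = abs(accel_z[i] - accel_z[i - 1]) / dt
        if jerk > threshold:
            impacts.append(i)
    return impacts

# Smooth motion, then a sharp spike at sample 4 (rise and fall both trip).
samples = [0.00, 0.02, 0.05, 0.07, 3.50, 0.10]
print(detect_impacts(samples, dt=0.001, threshold=500.0))  # → [4, 5]
```

Because the rule responds to the change rate rather than the absolute acceleration, slow, deliberate motion of the segment never trips it, while even a moderate collision does.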

CONCLUSION

In this paper, we presented a concept for sensorizing the skin of a humanoid robot based on a robust network of intelligent multimodal sensor modules (i.e., HEX-O-SKIN) and a control architecture that fuses the module data into robot reactions. We then introduced a prototype network of eight modules, which was integrated on a KUKA lightweight robotic arm. With this experimental setup, we presented experimental results of the robot reacting to the lightest touch and to multiple-touch inputs, balancing a cup on a plate at the end-effector, and reacting to impacts and air drafts.
Our contribution to humanoid sensing is a more systematic approach to technically realizing multimodal touch sensation for an entire robot. We designed and prototyped a network of small modules with local preprocessing and multiple sensor modalities that demonstrated the effectiveness of our approach. We simplified the interface between neighboring modules and added redundancy; thus, our skin can isolate local failures and automatically continue operation.