Skinput: Appropriating the Skin as an Interactive Canvas



INTRODUCTION

Devices with significant computational power and capability can now be easily carried with us. These devices have tremendous potential to bring the power of information, computation, creation, and communication to a wider audience and to more aspects of our lives. However, this potential raises new challenges for interaction design. For example, miniaturizing devices has simultaneously reduced their interactive surface area, leading to diminutive screens, cramped keyboards, and tiny jog wheels, all of which impose restrictions that diminish usability and prevent us from realizing the full potential of mobile computing. Consequently, mobile devices are approaching the computational capabilities of desktop computers, yet are hindered by a human–computer I/O bottleneck.



Always-available input

A primary goal of Skinput is to provide an always-available mobile input system, that is, an input system that does not require a user to carry or pick up a device. A number of alternative solutions to this problem have been proposed. Techniques based on computer vision are popular (e.g., Argyros and Lourakis [2], Mistry et al. [16], Wilson [24, 25]; see Erol et al. [5] for a recent survey). These, however, are computationally expensive and error-prone in mobile scenarios (where, e.g., non-input optical flow is prevalent), or depend on cumbersome instrumentation of the hands to enhance performance. Speech input (e.g., Lakshmipathy et al. [9] and Lyons et al. [11]) is a logical choice for always-available input, but it is limited in precision in unpredictable acoustic environments, suffers from privacy and scalability issues in shared environments, and may interfere with cognitive tasks significantly more than manual interfaces do.


Bio-sensing

Skinput leverages the natural acoustic conduction properties of the human body to provide an input system, and is thus related to previous work on the use of biological signals for computer input. Signals traditionally used for diagnostic medicine, such as heart rate and skin resistance, have been appropriated for assessing a user's emotional state (e.g., Mandryk and Atkins [12], Mandryk et al. [13], Moore and Dua [17]). These features are generally subconsciously driven and cannot be controlled with sufficient precision for direct input.



Bio-acoustics

When a finger taps the skin, several distinct forms of acoustic energy are produced. Some energy is radiated into the air as sound waves; this energy is not captured by the Skinput system. Of the acoustic energy transmitted through the arm, the most readily visible are transverse waves, created by the displacement of the skin from a finger impact (Figure 3). When viewed with a high-speed camera, these appear as ripples that propagate outward from the point of contact (like a pebble dropped into a pond). The amplitude of these ripples is correlated with the tapping force and with the volume and compliance of the soft tissues under the impact area. In general, tapping on soft regions of the arm creates higher-amplitude transverse waves than tapping on bony areas (e.g., wrist, palm, fingers), which have negligible compliance.
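The relationship between tap amplitude and tissue compliance can be illustrated with a toy sketch. This is not the actual Skinput pipeline (which uses an armband of tuned mechanical vibration sensors and machine-learning classification); the damped-sinusoid model, thresholds, and function names below are all illustrative assumptions.

```python
import math

def synth_tap(n=512, rate=5500, amp=1.0, freq=60.0, decay=25.0):
    """Synthesize a damped sinusoid loosely resembling the low-frequency
    transverse wave from a single finger tap (illustrative model only)."""
    return [amp * math.exp(-decay * t / rate) * math.sin(2 * math.pi * freq * t / rate)
            for t in range(n)]

def detect_taps(signal, threshold=0.2, refractory=400):
    """Return sample indices where |signal| first crosses the threshold,
    skipping a refractory window so one tap yields one detection."""
    onsets, i = [], 0
    while i < len(signal):
        if abs(signal[i]) >= threshold:
            onsets.append(i)
            i += refractory
        else:
            i += 1
    return onsets

def peak_amplitude(signal, start, window=100):
    """Peak absolute amplitude in a window after a detected onset."""
    return max(abs(x) for x in signal[start:start + window])

# A tap on compliant soft tissue vs. a weaker tap response on a bony area:
soft = synth_tap(amp=1.0)
bony = synth_tap(amp=0.3)
stream = [0.0] * 200 + soft + [0.0] * 200 + bony + [0.0] * 200

onsets = detect_taps(stream)
amps = [peak_amplitude(stream, i) for i in onsets]
```

Running this detects two taps, with the soft-tissue tap producing the larger peak amplitude, mirroring the compliance effect described above.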


CONCLUSION
In this paper, we presented our approach to appropriating the human body as an interactive surface. We described a novel, wearable, bio-acoustic sensing approach that can detect and localize finger taps on the forearm and hand. Results from our experiments show that the system performs well for a series of gestures, even when the body is in motion. We conclude with descriptions of several prototype applications that only begin to explore the rich design space we believe Skinput enables.