06-10-2012, 05:42 PM
Drishti: An Integrated Navigation System for Visually Impaired and Disabled
Abstract
Drishti is a wireless pedestrian navigation system. It
integrates several technologies, including wearable
computers, voice recognition and synthesis, wireless
networks, Geographic Information Systems (GIS), and the
Global Positioning System (GPS). Drishti provides
contextual information to the visually impaired and
computes optimized routes based on user preferences,
temporal constraints (e.g. traffic congestion), and
dynamic obstacles (e.g. ongoing ground work, road
blockades for special events). The system continuously
guides the blind user along the route using both static and
dynamic data. Environmental conditions and landmark
information queried from a spatial database along the
route are delivered on the fly through detailed explanatory
voice cues. The system also allows the user to add
intelligence, as perceived by the blind user, to
the central server hosting the spatial database. Our
system supplements other navigational aids such
as canes, guide dogs, and wheelchairs.
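The paper does not name the routing algorithm behind the optimized routes described above, so the following is only a minimal sketch in Java (the language used for the server components): a Dijkstra shortest-path search over a campus walkway graph, where a reported dynamic obstacle adds a time penalty to the affected segment. All node names and costs are illustrative assumptions.

```java
import java.util.*;

public class RoutePlanner {
    // Walkway graph: node -> (neighbor -> base walking cost, e.g. in seconds).
    private final Map<String, Map<String, Double>> edges = new HashMap<>();
    // Dynamic penalties on segments (ground work, event blockades), keyed "a->b".
    private final Map<String, Double> penalties = new HashMap<>();

    public void addEdge(String a, String b, double cost) {
        edges.computeIfAbsent(a, k -> new HashMap<>()).put(b, cost);
        edges.computeIfAbsent(b, k -> new HashMap<>()).put(a, cost);
    }

    // Called when a dynamic obstacle is reported on a segment.
    public void addPenalty(String a, String b, double extra) {
        penalties.merge(a + "->" + b, extra, Double::sum);
        penalties.merge(b + "->" + a, extra, Double::sum);
    }

    private double cost(String a, String b) {
        return edges.get(a).get(b) + penalties.getOrDefault(a + "->" + b, 0.0);
    }

    // Dijkstra's algorithm over (base cost + current penalty).
    public List<String> route(String start, String goal) {
        Map<String, Double> dist = new HashMap<>();
        Map<String, String> prev = new HashMap<>();
        PriorityQueue<Map.Entry<String, Double>> pq =
            new PriorityQueue<>(Map.Entry.comparingByValue());
        dist.put(start, 0.0);
        pq.add(new AbstractMap.SimpleEntry<>(start, 0.0));
        while (!pq.isEmpty()) {
            Map.Entry<String, Double> e = pq.poll();
            String u = e.getKey();
            if (e.getValue() > dist.getOrDefault(u, Double.MAX_VALUE)) continue; // stale entry
            if (u.equals(goal)) break;
            for (String v : edges.getOrDefault(u, Collections.emptyMap()).keySet()) {
                double nd = dist.get(u) + cost(u, v);
                if (nd < dist.getOrDefault(v, Double.MAX_VALUE)) {
                    dist.put(v, nd);
                    prev.put(v, u);
                    pq.add(new AbstractMap.SimpleEntry<>(v, nd));
                }
            }
        }
        // Reconstruct the path; return an empty list if the goal is unreachable.
        LinkedList<String> path = new LinkedList<>();
        for (String at = goal; at != null; at = prev.get(at)) path.addFirst(at);
        return path.getFirst().equals(start) ? path : Collections.emptyList();
    }

    public static void main(String[] args) {
        RoutePlanner planner = new RoutePlanner();
        planner.addEdge("Library", "Plaza", 60);
        planner.addEdge("Plaza", "MathDept", 60);
        planner.addEdge("Library", "Stadium", 90);
        planner.addEdge("Stadium", "MathDept", 90);
        System.out.println(planner.route("Library", "MathDept")); // shorter route via Plaza
        planner.addPenalty("Plaza", "MathDept", 300); // ground work reported on that segment
        System.out.println(planner.route("Library", "MathDept")); // rerouted via Stadium
    }
}
```

Because penalties are separate from base edge costs, a cleared obstacle can be removed and subsequent queries immediately revert to the original route, matching the dynamic behavior the abstract describes.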
Introduction
The main motivation behind this navigation system is
one of the authors' fathers, Dr. Theral Moore, who has
been blind since a young age. He is currently an Associate
Professor in the Department of Mathematics at the
University of Florida, specializing in Topology and
Number Theory. He no longer has any sight and cannot
even detect light. We believe that recent advances in
technology can help visually impaired and disabled
people in their day-to-day activities.
Related Work
Initial efforts in augmented reality dealt with see-through
head-mounted displays for assistance in
applications such as aviation, surgery, maintenance and
repair, building restoration, and parts assembly [8,
18]. The common feature of all these applications is
some form of precise tracking. Further, such applications
are usually restricted to small operating zones and
tethered to a fixed network.
In 1991, Golledge et al. were the earliest to propose
the use of GIS, GPS, speech, and sonic sensor
components for blind navigation, in a progress note on
the status of GIS [11]. MOBIC is a GPS-based travel aid
for the blind and elderly. It also uses a speech synthesizer
to recite predetermined travel plans [13]. This
test prototype is implemented on a handheld computer
with preloaded digital maps and limited wireless
capability to retrieve the latest information from a remote
database. A similar system was implemented by
Golledge et al. using a wearable computer [10].
Other terrestrial navigation support using augmented
reality has been developed for sighted people. Metronaut,
a campus visitor assistant at CMU, uses a bar-code
reader to infer its position from a series of bar-code
labels placed at strategic locations on campus
[16]. Similar systems have been developed by
researchers in which the user's current position is used to
overlay textual annotations and relevant information from
their web servers onto the image captured
through a head-mounted display [8, 18].
Problem Domain
When people walk from one place to another, they
make use of several different inputs. When a visually
impaired person walks from one building to another on
campus, they lack many of these useful inputs. Our goal
is to develop a system that augments a visually impaired
person's pedestrian experience with enough information
to make them feel comfortable on a walk from one
location to another. A map of the study area on the
University of Florida (UF) campus is shown in Figure 1.
For brevity and clarity, only the major layers of our GIS
database and only a portion of the study area are shown.
The study area covers about one fourth of the actual
campus. To evaluate the efficiency of the prototype, we
selected an area that includes various scenarios such as
crowded walkways, closely spaced buildings, and services.
System Design
In designing our prototype we used
Commercial-Off-The-Shelf (COTS) hardware and
software, which let us focus on the functionality
of the system. Our prototype weighs approximately 8 lbs,
which we believe most blind and disabled persons can
carry. The backpack is designed to distribute the
load evenly.
Figure 6 depicts a user (Steve) using the Drishti
prototype on a test run. The wearable computer, the GPS
receiver, and the electronic compass are placed
in the backpack. The user wears the head-mounted
display for visual tracking (for disabled users) and the
integrated headset for speech I/O (for blind users).
System Architecture
Figure 8 shows the client/proxy/server architecture
of our prototype. The server manages incoming requests
from mobile clients through a server-side proxy known
as the client manager. Each client manager acts as a
gateway between the client it supports and the GIS
database. The server and client manager were developed
in Java. The proxy shields the mobile clients from the
details and software requirements of the server; this was
important to keep the mobile client simple and to control
its footprint. Client managers make spatial queries and
add new spatial information via ESRI's Spatial Database
Engine Java API. The GIS database is exposed to
various campus departments, such as the University
Police, Physical Plant, and Special Events, giving them
the ability to insert and remove dynamic obstacles
and to monitor inputs added by individual users for
veracity.
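To make the client manager's gateway role concrete, here is a minimal sketch in Java (the language the paper says the server and client managers were written in). The `SpatialDb` interface, the `NEAR x y radius` request format, and all landmark names are our own illustrative assumptions; the actual prototype issues its queries through ESRI's Spatial Database Engine Java API rather than this stand-in.

```java
import java.util.*;
import java.util.stream.Collectors;

public class ClientManager {
    // Hypothetical stand-in for the spatial-query interface; the real
    // prototype uses ESRI's Spatial Database Engine Java API instead.
    interface SpatialDb {
        List<String> landmarksNear(double x, double y, double radius);
    }

    private final SpatialDb db;

    public ClientManager(SpatialDb db) { this.db = db; }

    // Gateway method: parses the mobile client's plain-text request
    // (assumed format: "NEAR x y radius"), runs the spatial query, and
    // returns a voice-cue-ready sentence, so the client never touches
    // any database detail.
    public String handle(String request) {
        String[] p = request.trim().split("\\s+");
        if (p.length != 4 || !p[0].equals("NEAR")) {
            return "Sorry, I did not understand that request.";
        }
        List<String> hits = db.landmarksNear(
            Double.parseDouble(p[1]), Double.parseDouble(p[2]), Double.parseDouble(p[3]));
        return hits.isEmpty() ? "No landmarks nearby."
                              : "Nearby landmarks: " + String.join(", ", hits) + ".";
    }

    public static void main(String[] args) {
        // In-memory landmark table (local x/y in meters) standing in for the GIS layers.
        Map<String, double[]> landmarks = new HashMap<>();
        landmarks.put("Math Department", new double[]{10, 10});
        landmarks.put("Library West", new double[]{200, 150});
        SpatialDb db = (x, y, r) -> landmarks.entrySet().stream()
            .filter(e -> Math.hypot(e.getValue()[0] - x, e.getValue()[1] - y) <= r)
            .map(Map.Entry::getKey)
            .sorted()
            .collect(Collectors.toList());

        ClientManager manager = new ClientManager(db);
        System.out.println(manager.handle("NEAR 0 0 50")); // Nearby landmarks: Math Department.
        System.out.println(manager.handle("NEAR 0 0 5"));  // No landmarks nearby.
    }
}
```

Keeping the database behind an interface like this is what lets the mobile client stay small: it exchanges only short text requests and spoken-sentence replies, which matches the footprint concern described above.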
Conclusion and Future Work
We have developed Drishti, meaning "vision" in the
ancient Indian language Sanskrit, a wireless pedestrian
navigation system for the visually impaired and disabled.
Most systems developed so far lack the kind of dynamic
interaction and adaptability to change that our system
provides to the user. We also emphasize contextual
awareness, which we believe is very important for
enhancing the navigational experience, especially for the
blind user.
To augment this system we are exploring indoor
navigation techniques. Currently our GIS database has
building plans registered with the rest of the layers, which
facilitates smooth outdoor/indoor navigational handoffs.
Further, these building plans contain extensive
information, such as fire exits, seating arrangements in
classrooms, elevator locations, and stairs with step counts,
based on strict building codes. The indoor navigation
module is still in the testbed phase; we are evaluating
various sensors for relative positioning inside buildings.
We would also like to explore the use of sensors to
monitor pedestrian traffic levels and update our database
directly.