21-05-2012, 10:12 AM
Three-Dimensional Navigation with Scanning Ladars
Concept & Initial Verification
Three-Dimensional Navigation.pdf (Size: 2.56 MB / Downloads: 31)
INTRODUCTION
This paper focuses on the use of laser radars (LADARs) for navigation of unmanned aerial vehicles (UAVs) in an urban environment. To enable operation of UAVs at any time in any environment, a precision navigation, attitude, and time (PNAT) capability onboard the vehicle is required. This capability should be robust and not solely dependent on the Global Positioning System (GPS), since GPS may not be available due to shadowing, significant signal attenuation, and multipath caused by buildings, or due to intentional denial or deception. The following operational scenario is considered in this paper. A UAV will take off at a known position in a known environment. After the take-off phase, the UAV will enter an unknown or partially known environment and start its mission toward the urban target environment. Upon arrival in the urban environment, the UAV may perform tasks such as surveillance. Navigation during the en-route phase is based on terrain-referenced navigation (TRN) techniques. The urban environment is fundamentally different from the en-route flight environment: whereas during en-route flight most information is found in the environment below the UAV, in the urban environment navigable information is mostly found around the aerial vehicle. Hence, the UAV platform must be capable of observing features in a wide field-of-view (FOV); 2D LADAR and 3D imaging sensors are excellent candidates for this approach.
3D LADAR-BASED NAVIGATION
This paper exploits planar surfaces (planes) as the basic feature for the 3D navigation solution. The rationale for the use of planes for navigation in 3D urban environments is that planes are common in man-made environments. To exemplify, Fig. 2 shows typical urban indoor (hallway) and outdoor (urban canyon) images. Multiple planes can be extracted from both images, as illustrated in Fig. 2. Since changes in image feature parameters between two different scans are used for navigation, a feature must be observed in both scans. Feature repeatability is thus essential for LADAR-based navigation. Planar surfaces satisfy this requirement as they are highly repeatable from scan to scan: if a wall of a building stays in the LADAR measurement range, then the plane associated with that wall repeats in the scan images.

Fig. 3 illustrates a generic navigation routine that exploits planar surfaces to derive the navigation solution. A 3D scan image of the environment is obtained by a scanning LADAR. Planes are extracted from LADAR images and used to estimate the navigation solution, which is comprised of changes in LADAR position and orientation between scans. In order to use a planar surface for the estimation of position and orientation changes from one scan to the next, this planar surface must be observed in both scans, and it must be known with certainty that a plane in one scan corresponds to the plane in the next scan. Hence, the feature matching procedure establishes a correspondence between planes extracted from the current scan and planes extracted from previous scans. The navigation routine stores planes extracted from previous scans in a plane list. The plane list is initially populated at the initial scan. If a new plane is observed during one of the following scans, the plane list is updated to include this new plane. In [8], INS
data are exploited to match lines extracted from 2D LADAR images for a 2D navigation case. In order to use INS data for plane matching, the line matching algorithms developed in [8] must be extended to the 3D case.

Fig. 3. Generic routine of 3D navigation that uses images of a scanning LADAR.

Hence, the feature matching procedure has to use position and orientation outputs of the INS to predict plane location and orientation in the current scan based on plane parameters observed in previous scans. If the predicted plane parameters match closely the parameters of the plane extracted from the current scan, a match is declared and the matched plane is used for navigation computations. Note that INS data can also be applied to compensate for LADAR motion during scans in those cases where such motion can introduce significant distortions into LADAR scan images. Following feature matching, changes in the parameters of the planes that are matched between different scans are exploited to estimate the navigation solution. Changes in plane parameters are also applied to periodically recalibrate the INS to reduce drift terms in inertial navigation outputs, in order to improve the quality of the INS-based plane prediction used by the feature matching procedure.
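The excerpt does not give the plane-extraction or matching math, so the following is only a rough sketch of the ideas described above, not the paper's implementation; all function names, frame conventions, and gating thresholds are my own assumptions. A plane n·x = d can be fitted to a patch of scan points by a centroid-plus-SVD least-squares fit, and INS-predicted plane parameters can then be gated against the planes extracted from the current scan:

```python
import numpy as np

def fit_plane(points):
    """Total-least-squares plane fit: returns a unit normal n and offset d
    with n . x = d, for an (N, 3) array of LADAR scan points."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                      # direction of least spread
    return n, float(n @ centroid)

def predict_plane(n, d, R, t):
    """Map plane (n, d) from the previous sensor frame into the current one,
    given an INS-derived rotation R and translation t with the (assumed)
    convention x_prev = R.T @ x_cur + t."""
    return R @ n, d - n @ t

def match_planes(prev_planes, cur_planes, R, t,
                 max_angle_deg=5.0, max_dist=0.2):
    """Declare a match when the INS-predicted plane agrees with an extracted
    plane in both normal direction and offset (thresholds are illustrative)."""
    cos_gate = np.cos(np.radians(max_angle_deg))
    matches = []
    for i, (n, d) in enumerate(prev_planes):
        n_p, d_p = predict_plane(n, d, R, t)
        for j, (n_c, d_c) in enumerate(cur_planes):
            if n_p @ n_c > cos_gate and abs(d_p - d_c) < max_dist:
                matches.append((i, j))
                break
    return matches

# Toy example: a wall at x = 5 m; the sensor moves 1 m toward it, no rotation
wall = np.array([[5.0, y, z] for y in range(4) for z in range(4)])
n, d = fit_plane(wall)
prev = [(n, d)]
cur = [(n, d - n @ np.array([1.0, 0.0, 0.0]))]   # wall appears 1 m closer
m = match_planes(prev, cur, np.eye(3), np.array([1.0, 0.0, 0.0]))
```

With the INS-predicted offset agreeing with the extracted one, the wall in the two scans is matched as a single repeated feature, which is exactly the repeatability property the text relies on.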
3D IMAGING TECHNOLOGIES
Various optical approaches exist to obtain 3D imagery of the environment, such as stereo-vision camera systems, the combination of a digital camera and projected light from a laser source, flash LADAR systems, and systems based on a LADAR scanning in both azimuth and elevation directions.

Flash LADAR sensors consist of a modulated laser emitter coupled with a focal plane array detector and the required optics. Similar to a conventional camera, this sensor creates an "image" of the environment, but instead of producing a 2D image where each pixel has an associated intensity value, the flash LADAR generates an image where each pixel measurement consists of an associated range and intensity value. Current low-cost flash LADAR technology is capable of greater than 100×100 pixel resolution with 5 mm depth resolution at a 30 Hz frame rate. Example commercial products are produced by MESA Imaging, Canesta, Inc., and PMD Technologies GmbH.
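To make the range-per-pixel idea concrete: a common way to use such a frame (not described in this excerpt, and assuming a simple pinhole model with a known focal length and principal point) is to back-project every pixel into a 3D point in the sensor frame:

```python
import numpy as np

def range_image_to_points(ranges, f, cx, cy):
    """Back-project a flash-LADAR range image into 3D points.

    ranges: (H, W) array of per-pixel range measurements (metres).
    f:      focal length in pixels (assumed pinhole model).
    cx, cy: principal point in pixels.
    Returns an (H*W, 3) array of points in the sensor frame.
    """
    h, w = ranges.shape
    v, u = np.mgrid[0:h, 0:w]
    # Unit ray through each pixel, then scale by the measured range
    rays = np.stack([(u - cx), (v - cy), np.full((h, w), f)], axis=-1)
    rays = rays / np.linalg.norm(rays, axis=-1, keepdims=True)
    return (rays * ranges[..., None]).reshape(-1, 3)

# 2x2 frame with all ranges at 1 m: every point lies 1 m from the sensor
pts = range_image_to_points(np.ones((2, 2)), f=100.0, cx=0.5, cy=0.5)
```

The resulting point cloud is the natural input to the plane-extraction step of the navigation routine described earlier.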
CONCLUSIONS
This paper investigates the use of scanning LADARs for autonomous navigation in three dimensions. The navigation solution is based on planar surfaces extracted from LADAR scan images. The paper develops a method for estimating plane parameters using images of a 2D scanning LADAR that is rotated in a limited elevation range (three different elevation angles are implemented). Changes in plane parameters between scans are applied to compute position and orientation changes. Least-squares linear position and attitude computation routines are presented. The use of DOP factors is introduced to formulate the influence of planar geometry on the navigation accuracy. Simulation results and test results presented demonstrate the feasibility of the 3D navigation methods developed.
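The least-squares position routines and DOP factors are only named here, not derived; the minimal sketch below is my own simplification of the translation part, under the assumption that the orientation change has already been resolved. A pure translation dp changes each matched plane's sensor-relative offset by n_i · dp, so stacking the matched normals gives a linear least-squares problem, and the geometry matrix (NᵀN)⁻¹ plays the DOP role:

```python
import numpy as np

def estimate_translation(normals, delta_d):
    """Least-squares translation from matched plane offset changes.

    normals: (N, 3) unit normals of matched planes (orientation change
             already compensated).
    delta_d: (N,) offset change d_prev - d_cur of each matched plane,
             which for a pure translation dp equals normals @ dp.
    Returns (dp, dop), where dop = sqrt(trace((N^T N)^-1)) summarises
    how well the plane geometry conditions the solution.
    """
    A = np.asarray(normals, dtype=float)
    dp, *_ = np.linalg.lstsq(A, np.asarray(delta_d, dtype=float), rcond=None)
    cov = np.linalg.inv(A.T @ A)
    return dp, float(np.sqrt(np.trace(cov)))

# Three mutually orthogonal walls pin down all three translation axes
N = np.eye(3)
dp, dop = estimate_translation(N, N @ np.array([0.5, -0.2, 1.0]))
```

With near-parallel walls (e.g. a long corridor) the geometry matrix becomes ill-conditioned and the DOP value grows, which is the geometric effect the paper's DOP factors are introduced to quantify.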