A Real-Time Vision System for Nighttime Vehicle Detection and Traffic Surveillance


Abstract

This paper presents an effective traffic surveillance
system for detecting and tracking moving vehicles in nighttime
traffic scenes. The proposed method identifies vehicles by detecting
and locating vehicle headlights and taillights using image segmentation
and pattern analysis techniques. First, a fast bright-object
segmentation process based on automatic multilevel histogram
thresholding is applied to effectively extract bright objects of interest.
This automatic multilevel thresholding approach provides
a robust and adaptable detection system that operates well under
various nighttime illumination conditions. The extracted bright
objects are then processed by a spatial clustering and tracking procedure
that locates and analyzes the spatial and temporal features
of vehicle light patterns, and identifies and classifies moving cars
and motorbikes in traffic scenes. The proposed real-time vision
system has also been implemented and evaluated on a TI DM642
DSP-based embedded platform. The system is set up on elevated
platforms to perform traffic surveillance on real highways and
urban roads. Experimental results demonstrate that the proposed
traffic surveillance approach is feasible and effective for vehicle
detection and identification in various nighttime environments.

INTRODUCTION

Detecting and recognizing vehicles is an important and emerging research area for intelligent transportation systems.
Previous studies on this topic have discussed traffic surveillance
[1]–[21], driver assistance systems and autonomous
vehicle guidance [22]–[33], and road traffic information systems
[34]–[36]. In traffic surveillance applications, information
about moving vehicles may be obtained from loop detectors,
slit sensors, or cameras. Among the aforementioned sensors,
camera-based systems can provide much more traffic analysis
information, including traffic flow, vehicle classification, and
vehicle speed.

LIGHTING OBJECT EXTRACTION

The first step in detecting and extracting moving vehicles
from nighttime traffic scenes is to segment the salient objects
of moving vehicles from traffic image sequences. Fig. 1 shows
samples of typical nighttime traffic scenes from an urban road
and highway under different environmental illumination conditions.
These sample figures show that typical nighttime traffic scenes contain moving cars and motorbikes on the road and that, whether the environment is poorly or brightly illuminated, vehicle lights are the only reliably salient features.
In addition to the vehicle lights, some lamps, traffic lights,
and signs are also visible sources of illumination in the image
sequences of nighttime traffic scenes.
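
As a rough illustration of this segmentation step, the following Python sketch extracts a bright-object plane from a grayscale nighttime frame. It uses multilevel Otsu thresholding from scikit-image as a stand-in for the paper's automatic multilevel histogram thresholding; the function name, the number of luminance classes, and the rule of keeping only the brightest class are illustrative assumptions rather than the authors' exact procedure.

# Bright-object extraction sketch: multilevel Otsu thresholding stands in
# for the paper's automatic multilevel histogram thresholding. The grayscale
# frame is split into several luminance classes, and only the brightest
# class is kept as the bright-object plane.
import numpy as np
from skimage.filters import threshold_multiotsu

def extract_bright_objects(gray_frame: np.ndarray, classes: int = 3) -> np.ndarray:
    """Return a binary mask of the brightest luminance class in the frame."""
    # Compute (classes - 1) thresholds from the grayscale histogram.
    thresholds = threshold_multiotsu(gray_frame, classes=classes)
    # Pixels above the highest threshold are treated as lighting objects
    # (headlights, taillights, street lamps, traffic lights, reflections).
    return gray_frame > thresholds[-1]

In practice the binary mask would then be passed to the connected-component analysis described in the next section; the number of classes controls how aggressively dimmer reflections are suppressed.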

SPATIAL CLASSIFICATION PROCESS
OF LIGHTING OBJECTS


To extract potential vehicle light components from
the detection zone in the bright-object plane, a connected-component
extraction process [39] can be performed to label
and locate the connected components of the bright objects.
Extracting the connected components reveals the meaningful
features of location, dimension, and pixel distribution associated
with each connected component. The location and dimension
of a connected component can be represented by the
bounding box surrounding it.
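
A minimal sketch of this labeling step is given below, using OpenCV's standard connected-component routine in place of the specific algorithm cited as [39]; the minimum-area filter and its value are illustrative assumptions.

# Connected-component sketch: label the bright-object plane and collect the
# location, dimensions, and pixel count of each component, i.e., the bounding
# box and pixel distribution used in the spatial classification process.
import cv2
import numpy as np

def extract_components(bright_mask: np.ndarray, min_area: int = 10):
    """Return (x, y, w, h, area) for each sufficiently large bright component."""
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(
        bright_mask.astype(np.uint8), connectivity=8)
    components = []
    for i in range(1, num):            # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:           # discard tiny specks and noise
            components.append((int(x), int(y), int(w), int(h), int(area)))
    return components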

Motion-Based Grouping of Vehicle Components

With the tracks of potential vehicle components, the subsequent
motion-based grouping process groups potential vehicle
components belonging to the same vehicles. For this purpose,
potential vehicle components with rigidly similar motions in
successive frames are grouped into a single vehicle. Fig. 11
shows this concept.
The paired tracks of nearby potential vehicle components TP_i^t and TP_j^t are determined to belong to the same vehicle if they continue to move coherently and reveal homogeneous features for a period of time. The coherent motion of vehicle components is determined by coherent-motion conditions that compare their displacements and spacing over successive frames.
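
A simple illustrative check of such conditions is sketched below: two component tracks are considered to move coherently if their frame-to-frame displacements stay similar and their spacing stays nearly constant over a short window. The thresholds and the exact form of the test are assumptions for illustration, not the conditions defined in the paper.

# Motion-coherence sketch for grouping potential vehicle components.
import numpy as np

def move_coherently(track_i, track_j, max_motion_diff=3.0, max_gap_var=2.0):
    """track_i, track_j: lists of (cx, cy) centroids over the same recent frames."""
    pi = np.asarray(track_i, dtype=float)
    pj = np.asarray(track_j, dtype=float)
    # Frame-to-frame displacement of each component track.
    vi = np.diff(pi, axis=0)
    vj = np.diff(pj, axis=0)
    # Condition 1: the two components move with nearly identical velocity.
    similar_motion = np.all(np.linalg.norm(vi - vj, axis=1) < max_motion_diff)
    # Condition 2: the distance between the components stays almost constant,
    # as expected for a rigid pair of lights on the same vehicle.
    gaps = np.linalg.norm(pi - pj, axis=1)
    steady_gap = np.std(gaps) < max_gap_var
    return bool(similar_motion and steady_gap)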

Tracking Process of Vehicle Component Groups

When a potential vehicle represented by a component group
is being tracked across the detection area, the segmentation
process and the motion-based grouping process can cause occlusion problems: 1) two vehicles moving in parallel on the same lane may be too close to each other (particularly large vehicles, such as buses, vans, or lorries, moving alongside nearby motorbikes) and may be occluded for a while, because the spatial coherence criterion based on lane information cannot completely prevent this during the motion-based grouping process; and 2) some large vehicles have multiple light pairs and therefore may not be immediately merged into single groups during the motion-based grouping process. Therefore, using the potential vehicle tracks of component groups TG_k^t ∈ TG^t obtained by the motion-based grouping process, the component group tracking process can update the position, motion, and dimensions of each potential vehicle. This process progressively
refines the detection results of potential vehicles using
spatial–temporal information in sequential frames. This section
describes the tracking process for component groups of potential
vehicles, which handles the aforementioned occlusion
problems.
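
The sketch below gives one way such a group track could be refined frame by frame: the bounding box and velocity of each potential vehicle are blended with the latest detection. The exponential update and the smoothing factor are illustrative assumptions; the paper's actual refinement and its handling of occlusion and late-merging light pairs are more involved.

# Group-tracking sketch: each potential vehicle is a component group whose
# position, motion, and dimensions are updated as new frames arrive.
from dataclasses import dataclass

@dataclass
class GroupTrack:
    x: float      # bounding box origin (left)
    y: float      # bounding box origin (top)
    w: float      # bounding box width
    h: float      # bounding box height
    vx: float = 0.0   # estimated horizontal motion per frame
    vy: float = 0.0   # estimated vertical motion per frame

    def update(self, det, alpha: float = 0.5):
        """Blend a new detection (x, y, w, h) into the track state."""
        dx, dy, dw, dh = det
        # Motion estimate from the displacement of the box origin.
        self.vx = alpha * (dx - self.x) + (1 - alpha) * self.vx
        self.vy = alpha * (dy - self.y) + (1 - alpha) * self.vy
        # Smoothed position and dimensions of the potential vehicle.
        self.x, self.y = dx, dy
        self.w = alpha * dw + (1 - alpha) * self.w
        self.h = alpha * dh + (1 - alpha) * self.h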

EXPERIMENTAL RESULTS

This section describes the implementation of the proposed
vehicle detection, tracking, and classification system on a DSP-based
real-time system. We conducted various representative
real-time experiments to evaluate the vehicle detection and
classification performance obtained by the proposed system.
The proposed real-time vision system was implemented on a TI
DM642 DSP-based embedded platform operating at 600 MHz
with 32 MB of DRAM, and set up on elevated platforms near
highways and urban roads.

CONCLUSION

This paper has proposed an effective nighttime vehicle detection
and tracking system for identifying and classifying
moving vehicles for traffic surveillance. The proposed approach
uses an efficient and fast bright-object segmentation process
based on automatic multilevel histogram thresholding to extract
bright objects from nighttime traffic image sequences. This
technique is robust and adaptable when dealing with varying
lighting conditions at night. A spatial analysis and clustering
procedure then groups lighting objects into vehicle-light groups
belonging to potential moving cars and motorbikes. Next, a
new effective feature-based vehicle tracking and identification
process analyzes the spatial and temporal information of these
potential vehicle light groups from consecutive frames.