Abstract: This paper describes the basic technological aspects of digital image
processing, with special reference to satellite image processing. Broadly, all satellite
image-processing operations can be grouped into three categories: Image
Rectification and Restoration, Enhancement, and Information Extraction. The first
deals with the initial processing of raw image data to correct geometric distortion,
to calibrate the data radiometrically and to eliminate noise present in the data.
Enhancement procedures are applied to image data in order to display the data
effectively for subsequent visual interpretation; they comprise techniques for increasing
the visual distinction between features in a scene. The objective of the information
extraction operations is to replace visual analysis of the image data with quantitative
techniques that automate the identification of features in a scene. This involves
the analysis of multispectral image data and the application of statistically based
decision rules to determine the land-cover identity of each pixel in an image.
The intent of the classification process is to categorize all pixels in a digital image into
one of several land-cover classes, or themes. The classified data may then be used to
produce thematic maps of the land cover present in an image.
INTRODUCTION
Pictures are the most common and convenient means of conveying or
transmitting information. A picture is worth a thousand words. Pictures
concisely convey information about positions, sizes and inter-relationships
between objects. They portray spatial information that we can recognize as
objects. Human beings are good at deriving information from such images,
because of our innate visual and mental abilities. About 75% of the
information received by humans is in pictorial form.
In the present context, we consider the analysis of pictures that employ an
overhead perspective, including radiation not visible to the human eye. Thus
our discussion will focus on the analysis of remotely sensed images.
These images are represented in digital form. When represented as numbers,
brightness can be added, subtracted, multiplied, divided and, in general,
subjected to statistical manipulations that are not possible if an image is
presented only as a photograph. Although digital analysis of remotely sensed
data dates from the early days of remote sensing, the launch of the first Landsat
earth observation satellite in 1972 began an era of increasing interest in
machine processing (Campbell, 1996 and Jensen, 1996). Previously, digital
remote sensing data could be analyzed only at specialized remote sensing
laboratories. Specialized equipment and trained personnel necessary to conduct
routine machine analysis of data were not widely available, in part because of
limited availability of digital remote sensing data and a lack of appreciation
of their qualities.
DIGITAL IMAGE
A digital remotely sensed image is typically composed of picture elements
(pixels) located at the intersection of each row i and column j of each of the K bands
of imagery. Associated with each pixel is a number known as Digital Number
(DN) or Brightness Value (BV), which represents the average radiance of a relatively
small area within a scene (Fig. 1). A small number indicates low average
radiance from the area, while a high number indicates high radiance.
The size of this area affects the reproduction of detail within the scene:
as the pixel size is reduced, more scene detail is preserved in the digital
representation.
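The row/column/band arrangement described above can be sketched as a small array of Digital Numbers; the values below are synthetic, for illustration only:

```python
import numpy as np

# A digital image as a rows x columns x bands array of Digital Numbers (DNs).
# The DN values here are randomly generated stand-ins for real imagery.
rows, cols, bands = 4, 4, 3
image = np.random.default_rng(0).integers(
    0, 256, size=(rows, cols, bands), dtype=np.uint8
)

# The DN of the pixel at row i, column j in band k:
i, j, k = 2, 1, 0
dn = image[i, j, k]
print(f"DN at (row={i}, col={j}, band={k}) = {dn}")

# A lower DN indicates lower average radiance from that ground area.
```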
COLOR COMPOSITES
When the different bands of a multispectral data set are displayed in image
planes other than their own, the resulting color composite is known as a
False Color Composite (FCC). High spectral resolution is important when
producing color composites. For a true color composite, the image data
acquired in the red, green and blue spectral regions must be assigned to the
red, green and blue planes of the image processor's frame buffer memory. A
color infrared composite, the 'standard false color composite', is displayed
by placing the infrared, red and green bands in the red, green and blue frame
buffer memory respectively (Fig. 2). In this composite, healthy vegetation
shows up in shades of red because vegetation absorbs most of the green and
red energy but reflects approximately half of the incident infrared energy.
Urban areas reflect roughly equal portions of NIR, red and green, and
therefore appear steel grey.
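The band-to-channel assignment for the standard false color composite can be sketched as follows; the band arrays are synthetic stand-ins for real imagery:

```python
import numpy as np

# 'Standard false color composite': the NIR, red and green bands are written
# to the red, green and blue display channels, in that order.
rng = np.random.default_rng(1)
nir = rng.integers(0, 256, size=(100, 100), dtype=np.uint8)
red = rng.integers(0, 256, size=(100, 100), dtype=np.uint8)
green = rng.integers(0, 256, size=(100, 100), dtype=np.uint8)

# NIR -> red channel, red -> green channel, green -> blue channel
fcc = np.dstack([nir, red, green])
print(fcc.shape)  # (100, 100, 3)
```

Because healthy vegetation is bright in NIR, it dominates the red channel of such a composite and renders in shades of red.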
IMAGE RECTIFICATION AND REGISTRATION
Geometric distortions manifest themselves as errors in the position of a
pixel relative to other pixels in the scene and with respect to their absolute
position within some defined map projection. If left uncorrected, these
geometric distortions render any data extracted from the image useless. This
is particularly so if the information is to be compared to other data sets, be it
from another image or a GIS data set. Distortions occur for many reasons:
for instance, changes in platform attitude (roll, pitch and yaw), altitude,
earth rotation, earth curvature, panoramic distortion and detector delay.
Most of these distortions can be modelled mathematically and
are removed before you buy an image. Changes in attitude, however, can be
difficult to account for mathematically and so a procedure called image
rectification is performed. Satellite systems are however geometrically quite
stable and geometric rectification is a simple procedure based on a mapping
transformation relating real ground coordinates, say in easting and northing,
to image line and pixel coordinates.
Rectification is a process of geometrically correcting an image so that it can
be represented on a planar surface, conform to other images or conform to a
map (Fig. 3). That is, it is the process by which geometry of an image is made
planimetric. It is necessary when accurate area, distance and direction
measurements are required to be made from the imagery. It is achieved by
transforming the data from one grid system into another grid system using a
geometric transformation.
Rectification is not necessary if there is no distortion in the image. For
example, if an image file is produced by scanning or digitizing a paper map
that is in the desired projection system, then that image is already planar and
does not require rectification unless there is some skew or rotation of the image.
Scanning and digitizing produce images that are planar, but do not contain
any map coordinate information. These images need only to be geo-referenced,
which is a much simpler process than rectification. In many cases, the image
header can simply be updated with new map coordinate information. This
involves redefining the map coordinate of the upper left corner of the image
and the cell size (the area represented by each pixel).
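The header-based geo-referencing just described can be sketched as follows; the coordinate values, cell size and function name are illustrative assumptions, not from the source:

```python
# Geo-referencing via an image header: with the map coordinate of the
# upper-left corner and the cell size, the map position of any pixel
# follows directly. Values below are hypothetical.
ul_easting, ul_northing = 350000.0, 4200000.0  # upper-left corner (metres)
cell_size = 30.0                               # ground area per pixel side (metres)

def pixel_to_map(row, col):
    """Map coordinate of the centre of pixel (row, col)."""
    easting = ul_easting + (col + 0.5) * cell_size
    northing = ul_northing - (row + 0.5) * cell_size  # rows increase southward
    return easting, northing

print(pixel_to_map(0, 0))  # (350015.0, 4199985.0)
```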
Ground Control Points (GCPs) are specific pixels in the input image
for which the output map coordinates are known. By using more points than
necessary to solve the transformation equations a least squares solution may
be found that minimises the sum of the squares of the errors. Care should be
exercised when selecting ground control points as their number, quality and
distribution affect the result of the rectification.
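The least-squares fit from GCPs can be sketched with a first-order (affine) mapping; the GCP coordinates below are made-up values for illustration:

```python
import numpy as np

# Fit x' = a0 + a1*x + a2*y (and likewise y') from GCPs by least squares.
# Map coordinates (easting, northing) and matching image coordinates
# (column, row) for four hypothetical GCPs:
map_xy = np.array([[100., 200.], [500., 210.], [110., 600.], [520., 610.]])
img_xy = np.array([[ 10.,  12.], [ 90.,  14.], [ 12.,  95.], [ 92.,  97.]])

# Design matrix [1, x, y] for the affine model:
A = np.c_[np.ones(len(map_xy)), map_xy]
coef_x, *_ = np.linalg.lstsq(A, img_xy[:, 0], rcond=None)
coef_y, *_ = np.linalg.lstsq(A, img_xy[:, 1], rcond=None)

# With more GCPs than unknowns, lstsq minimises the sum of squared errors;
# the residual RMSE is one measure of rectification quality.
residuals = A @ np.c_[coef_x, coef_y] - img_xy
rmse = np.sqrt((residuals ** 2).mean())
print("RMSE (pixels):", rmse)
```

Using more GCPs than the three needed for an affine model, and spreading them over the whole scene, keeps this residual small and the fit well conditioned.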
Once the mapping transformation has been determined, a procedure called
resampling is employed. Resampling matches the coordinates of image pixels
to their real-world coordinates and writes a new image on a pixel-by-pixel
basis. Since the grid of pixels in the source image rarely matches the grid of
the output map, the output pixel values must be estimated by interpolation,
commonly by nearest neighbour, bilinear interpolation or cubic convolution.
IMAGE ENHANCEMENT TECHNIQUES
Image enhancement techniques improve the quality of an image as
perceived by a human. These techniques are most useful because many satellite
images when examined on a colour display give inadequate information for
image interpretation. There is no conscious effort to improve the fidelity of
the image with regard to some ideal form of the image. There exists a wide
variety of techniques for improving image quality. The contrast stretch, density
slicing, edge enhancement, and spatial filtering are the more commonly used
techniques. Image enhancement is attempted after the image is corrected for
geometric and radiometric distortions. Image enhancement methods are
applied separately to each band of a multispectral image. Digital techniques
have been found to be more satisfactory than photographic techniques for
image enhancement, because of the precision and wide variety of digital
processes.
Contrast
Contrast generally refers to the difference in luminance or grey level values
in an image and is an important characteristic. It can be defined as the ratio
of the maximum intensity to the minimum intensity over an image.
The contrast ratio has a strong bearing on the resolving power and detectability
of an image: the larger this ratio, the easier it is to interpret the image. Satellite
images often lack adequate contrast and require contrast improvement.
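The ratio definition above amounts to a one-line computation; the tiny image below is synthetic, for illustration only:

```python
import numpy as np

# Contrast ratio: maximum intensity over minimum intensity in the image.
# A ratio close to 1 means a flat, low-contrast image.
image = np.array([[60., 70.], [80., 90.]])
contrast_ratio = image.max() / image.min()
print(contrast_ratio)  # 1.5
```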
Contrast Enhancement
Contrast enhancement techniques expand the range of brightness values
in an image so that the image can be efficiently displayed in a manner desired
by the analyst. The density values in a scene are literally pulled farther apart,
that is, expanded over a greater range. The effect is to increase the visual
contrast between two areas of different uniform densities. This enables the
analyst to discriminate easily between areas initially having a small difference
in density.
Linear Contrast Stretch
This is the simplest contrast stretch algorithm. The grey values in the
original image and the modified image follow a linear relation in this
algorithm. A density number in the low range of the original histogram is
assigned to black, and a value at the high end is assigned to white. The
remaining pixel values are distributed linearly between these
extremes. The features or details that were obscure on the original image will
be clear in the contrast stretched image. Linear contrast stretch operation can
be represented graphically as shown in Fig. 4. To provide optimal contrast
and colour variation in colour composites the small range of grey values in
each band is stretched to the full brightness range of the output or display
unit.
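The linear stretch described above can be sketched as follows; the input band is a synthetic stand-in, and the function name is an illustrative choice:

```python
import numpy as np

# Linear contrast stretch: map the observed DN range [low, high] linearly
# onto the full display range [0, 255].
def linear_stretch(band, out_min=0, out_max=255):
    low, high = band.min(), band.max()
    stretched = (band - low) / (high - low) * (out_max - out_min) + out_min
    return stretched.round().astype(np.uint8)

band = np.array([[60, 70], [80, 90]])
print(linear_stretch(band))
# 60 -> 0 (black), 90 -> 255 (white); intermediate DNs fall linearly between
```

Because the minimum DN maps to black and the maximum to white, subtle density differences that were compressed into a narrow grey range become visually separable.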