04-06-2012, 11:58 AM
Adaptive Filters
Introduction
An adaptive filter is defined as a self-designing system that relies for its operation on a recursive algorithm,
which makes it possible for the filter to perform satisfactorily in an environment where knowledge of the
relevant statistics is not available.
Adaptive filters are classified into two main groups: linear and nonlinear. Linear adaptive filters compute
an estimate of a desired response by using a linear combination of the available set of observables applied
to the input of the filter; otherwise, the adaptive filter is said to be nonlinear. Adaptive filters may also be
classified into:
(i) Supervised adaptive filters, which require the availability of a training sequence that provides different
realizations of a desired response for a specified input signal vector. The desired response is compared
against the actual response of the filter due to the input signal vector, and the resulting error signal is
used to adjust the free parameters of the filter. The process of parameter adjustments is continued in a
step-by-step fashion until a steady-state condition is established.
(ii) Unsupervised adaptive filters, which perform adjustments of their free parameters without the need for
a desired response. For the filter to perform its function, its design includes a set of rules that enable it
to compute an input-output mapping with specific desirable properties. In the signal-processing
literature, unsupervised adaptive filtering is often referred to as blind deconvolution or blind
adaptation.
Least-Mean-Square (LMS) Algorithm
The LMS algorithm has established itself as the workhorse of adaptive signal processing for two primary
reasons:
• Simplicity of implementation and a computational efficiency that is linear in the number of adjustable
parameters.
• Robust performance
Hassibi et al. [6] have shown that a single realization of the LMS algorithm is optimal in the H∞ (i.e.,
minimax) sense. This result explains the robust behavior of the LMS algorithm.
Basically, the LMS algorithm is a stochastic gradient algorithm, which means that the gradient of the error
performance surface with respect to the free parameter vector changes randomly from one iteration to the
next. This stochasticity, combined with the presence of nonlinear feedback, is responsible for making a
detailed convergence analysis of the LMS algorithm a difficult mathematical task. Indeed, it has attracted
research attention for over 25 years [7-10].
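The stochastic-gradient update described above can be sketched in a few lines. This is a minimal illustration, not code from the attached paper: the function name, tap count, and step size `mu` are chosen for the example, and the test plant `h` is an arbitrary FIR system used to show the weights converging toward the desired response.

```python
import numpy as np

def lms_filter(x, d, num_taps, mu):
    """Adapt an FIR filter so its output tracks the desired response d.

    x        : input signal (1-D array)
    d        : desired response, same length as x
    num_taps : number of adjustable tap weights
    mu       : step size controlling convergence speed vs. misadjustment
    """
    w = np.zeros(num_taps)              # free parameters (tap weights)
    y = np.zeros(len(x))                # filter output
    e = np.zeros(len(x))                # error signal
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # current input vector [x[n], x[n-1], ...]
        y[n] = w @ u                          # actual response of the filter
        e[n] = d[n] - y[n]                    # compare against the desired response
        w += mu * e[n] * u                    # stochastic-gradient (LMS) update
    return y, e, w

# Usage sketch: identify an (assumed) unknown FIR plant h from input/output data.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])           # hypothetical plant to identify
d = np.convolve(x, h)[:len(x)]                # noiseless desired response
y, e, w = lms_filter(x, d, num_taps=4, mu=0.01)
```

Note the per-iteration cost: one inner product and one scaled vector addition, i.e., linear in the number of adjustable parameters, which is the computational efficiency claimed above.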
Tracking of Time-Varying Systems
When an adaptive filter operates in a nonstationary environment the requirement is not only to converge to
the minimum point of the error performance surface but also to continually track the statistical variations
of the input signal. Tracking is a steady-state phenomenon, whereas convergence is a transient
phenomenon. This means that an adaptive filter must pass from the transient mode to the steady-state mode
before it can start tracking. Moreover, the rate of convergence and tracking are two distinct properties,
which means that an algorithm with good convergence properties does not necessarily possess a fast
tracking capability.
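The convergence/tracking distinction can be made concrete with a toy nonstationary problem. In this sketch (my own illustration, not from the paper) a single-tap plant `a[n]` drifts slowly; after the initial transient, the LMS weight follows the drifting parameter with a small lag, which is the steady-state tracking behavior described above.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4000
x = rng.standard_normal(N)

# Slowly time-varying single-tap plant: the environment is nonstationary.
a = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(N) / N)
d = a * x                                  # desired response from the drifting plant

mu = 0.05                                  # step size (illustrative choice)
w = 0.0
w_hist = np.zeros(N)
for n in range(N):
    e = d[n] - w * x[n]                    # error signal
    w += mu * e * x[n]                     # scalar LMS update
    w_hist[n] = w

# Transient (convergence) phase, then steady-state tracking of a[n].
tracking_error = np.mean(np.abs(w_hist[500:] - a[500:]))
```

A larger `mu` shortens the transient and reduces tracking lag but raises gradient noise, so good convergence settings and good tracking settings need not coincide.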
Neural Networks
A discussion of adaptive filters would be incomplete without a brief mention of (artificial) neural
networks, for one important class of which we offer the following definition:
A neural network is a massively parallel distributed processor made up of simple
processing units (known as neurons), which has a natural propensity for storing
experiential knowledge and making it available for use. It resembles the human brain in
two respects: