PARAMETRIC METHODS


General consideration of parametric model spectrum estimation:

Autoregressive spectrum estimation:

A. The autocorrelation method
B. The covariance method
C. Modified covariance method
D. Burg algorithm:
E. Selection of the model order:

Moving average spectrum estimation:

Autoregressive moving average spectrum estimation:

Minimum Variance Spectrum Estimation (a nonparametric method):

A. The relation of the variance with the power spectrum
B. FIR bandpass filter bank and the variances of the filters’ outputs:
C. The FIR bandpass filters with minimum variance
D. The minimum variance spectral estimate

Maximum entropy method (an AR parametric method):

A. The concept of entropy
B. Extrapolation of the autocorrelation
C. The maximum entropy spectral estimate


PARAMETRIC METHODS FOR POWER SPECTRUM ESTIMATION

Parametric methods for power spectrum estimation are based on parametric models, and they include those
of the autoregressive (AR) spectral estimation, the moving average (MA) spectral estimation, and the
autoregressive moving average (ARMA) spectral estimation, which are, respectively, based on the AR, MA,
and ARMA models. The maximum entropy method has the same form as the AR spectral estimation.
In parametric methods, a parametric model for a random process is first selected and then the model
parameters are determined.
The parametric spectral estimators are less biased and have a lower variance than the nonparametric
spectral estimators. With parametric methods it is possible to significantly improve the resolution of the
spectrum estimation, provided that the model used is consistent with the random process being analyzed;
otherwise, inaccurate or misleading spectrum estimates may result.

General consideration of parametric model spectrum estimation:

As we have learnt in the previous lectures concerning Signal Modeling, a random process can be
modeled with an ARMA model, an AR model, or an MA model. The AR and MA models are special
cases of the ARMA model. Suppose that a random process x(n) is modeled as an ARMA(p, q) process
with an ARMA(p, q) model; then the system function of the model is


H(e^{j\omega}) = \frac{\sum_{k=0}^{q} b_q(k)\, e^{-jk\omega}}{1 + \sum_{k=1}^{p} a_p(k)\, e^{-jk\omega}}   (43)
In this case, the power spectrum of the process x(n) can be computed in the following manner,
P_x(e^{j\omega}) = \frac{\left| \sum_{k=0}^{q} b_q(k)\, e^{-jk\omega} \right|^2}{\left| 1 + \sum_{k=1}^{p} a_p(k)\, e^{-jk\omega} \right|^2}   (44)
Alternatively, if the autocorrelation r_x(k) is given, the power spectrum can be obtained from the Fourier
transform of r_x(k),

P_x(e^{j\omega}) = \sum_{k=-\infty}^{\infty} r_x(k)\, e^{-jk\omega} .   (45)
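As a quick numerical illustration of Eq. (45), the following Python/NumPy sketch evaluates the sum of r_x(k) e^{-jkω} on a frequency grid for a made-up autocorrelation (a random-phase sinusoid in white noise, truncated to |k| ≤ K); none of these values come from the text.

    import numpy as np

    # Hypothetical autocorrelation: sinusoid in white noise,
    # r_x(k) = sigma2 * delta(k) + (A**2 / 2) * cos(k * omega0), truncated to |k| <= K.
    K, sigma2, A, omega0 = 32, 1.0, 2.0, 0.4 * np.pi
    k = np.arange(-K, K + 1)
    r_x = sigma2 * (k == 0) + 0.5 * A**2 * np.cos(k * omega0)

    # Eq. (45): P_x(e^{jw}) = sum_k r_x(k) e^{-jkw}, here over the available lags only.
    w = np.linspace(-np.pi, np.pi, 512)
    P_x = np.real(np.exp(-1j * np.outer(w, k)) @ r_x)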
Eqs. (44) and (45) demonstrate two approaches to computing the power spectrum of an ARMA process, and
also reveal two equivalent representations of an ARMA random process; that is, the process can be
represented equivalently either by a finite sequence of model parameters a_p(k) and b_q(k), or by an
autocorrelation sequence r_x(k). The two representations are equivalent because the autocorrelation
and the model parameters are related through the Yule-Walker equations,

r_x(k) + \sum_{l=1}^{p} a_p(l)\, r_x(k-l) = \sigma_v^2 \sum_{l=k}^{q} b_q(l)\, h^*(l-k) .   (46)
In practice, a random process x(n) is often given only over a finite interval, 0 ≤ n ≤ N−1, and in this case
the autocorrelation of x(n) must be estimated as a finite sum,

\hat{r}_x(k) = \frac{1}{N} \sum_{n=0}^{N-1-k} x(n+k)\, x^*(n) ,   k = 0, 1, …, N−1.   (47)
When the ARMA model in Eq. (43) is selected for modeling process x(n), the model parameters in this case
are determined from this estimated autocorrelation sequence r̂_x(k), and they are different from a_p(k) and
b_q(k) determined from r_x(k), since r̂_x(k) is, in general, not equal to r_x(k). Such model parameters that
are determined from r̂_x(k) are denoted by â_p(k) and b̂_q(k), which give an estimate of the power spectrum,
\hat{P}_x(e^{j\omega}) = \frac{\left| \sum_{k=0}^{q} \hat{b}_q(k)\, e^{-jk\omega} \right|^2}{\left| 1 + \sum_{k=1}^{p} \hat{a}_p(k)\, e^{-jk\omega} \right|^2}   (48)
Eq. (48) is a general case of the parametric spectral estimation methods. In this case, all we need to do for
estimating the power spectrum is to find â_p(k) and b̂_q(k). When â_p(k) and b̂_q(k) are determined,
P̂_x(e^{jω}) is found.
Among these parametric spectral estimations, the AR estimation is the most popular. This is because the
AR parameters can be found by solving a set of linear equations, whereas for the ARMA and MA
parameters a set of nonlinear equations must be solved, which is much more difficult.
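Once estimates â_p(k) and b̂_q(k) are available (by whatever method), evaluating Eq. (48) is just the ratio of two polynomial magnitudes. A minimal sketch, assuming the coefficients are stored as b_hat = [b̂_q(0), …, b̂_q(q)] and a_hat = [1, â_p(1), …, â_p(p)]:

    import numpy as np
    from scipy.signal import freqz

    def arma_psd(b_hat, a_hat, n_freq=512):
        """Evaluate Eq. (48), |B(e^{jw})|^2 / |A(e^{jw})|^2, on n_freq points in [0, pi)."""
        w, H = freqz(b_hat, a_hat, worN=n_freq)   # H(e^{jw}) = B(e^{jw}) / A(e^{jw})
        return w, np.abs(H) ** 2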

Autoregressive spectrum estimation:

The autoregressive spectrum estimation is based on the AR model. In this case, a random process x(n) is
modeled as an AR(p) process. If the autocorrelation r_x(k) of a random process x(n) is given, the AR
parameters, a_p(k) and b(0), can be determined from r_x(k) using the AR model. Then the power spectrum
of the AR process is
P_x(e^{j\omega}) = \frac{|b(0)|^2}{\left| 1 + \sum_{k=1}^{p} a_p(k)\, e^{-jk\omega} \right|^2}   (49)
If a random process x(n) is given over a finite interval 0 ≤ n ≤ N−1, the autocorrelation of x(n) must be
estimated, and it is denoted r̂_x(k). The AR parameters that are determined from the estimated
autocorrelation r̂_x(k) are defined as â_p(k) and b̂(0). The power spectrum estimated from x(n) is then
obtained by substituting â_p(k) and b̂(0) into the form of Eq. (49).

A. The autocorrelation method

The AR parameters â_p(k) are found by solving the autocorrelation normal equations. Note that Eq. (51) is
the same in form as the modified Yule-Walker equations, but the autocorrelation values in Eq. (51) are the
estimated ones, r̂_x(k), obtained from a finite data record, i.e., x(n) for 0 ≤ n ≤ N−1. The autocorrelation
estimate r̂_x(k) is biased. This method of estimating the power spectrum is also referred to as the
Yule-Walker method (YWM).
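Eqs. (50)-(51) are not reproduced in this extract, so the sketch below follows the standard autocorrelation-method formulation: build the p × p Toeplitz matrix from r̂_x(0), …, r̂_x(p−1), solve for â_p(1), …, â_p(p), and take the modeling-error variance as |b̂(0)|². It reuses acorr_biased from the sketch after Eq. (47) and assumes real-valued data.

    import numpy as np
    from scipy.linalg import toeplitz

    def yule_walker_ar(x, p):
        """AR(p) estimation by the autocorrelation (Yule-Walker) method (real-valued x)."""
        r = acorr_biased(x, p)                              # r_hat(0..p), Eq. (47)
        a = np.linalg.solve(toeplitz(r[:p]), -r[1:p + 1])   # a_hat(1..p)
        b0_sq = r[0] + a @ r[1:p + 1]                       # error variance = |b_hat(0)|^2
        return np.concatenate(([1.0], a)), b0_sq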

B. The covariance method

In the covariance method the AR parameters â_p(k) are found by solving the normal equations in Eq. (54).
This method differs from the autocorrelation method in that no windowing of the data is required: the values
of x(n) used for finding r̂_x(k, l) in Eq. (55) all lie in the interval 0 ≤ n ≤ N−1, so no zero padding is needed.
This means that there is no windowing effect in the covariance method. Therefore, for short data records the
covariance method generally gives higher-resolution spectrum estimates than the autocorrelation method.
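Eqs. (54)-(55) are likewise not shown above; the sketch below is a least-squares formulation that is equivalent to solving the covariance normal equations (minimize the forward prediction error over n = p, …, N−1 only, so no sample outside 0 ≤ n ≤ N−1 and no zero padding are involved):

    import numpy as np

    def covariance_ar(x, p):
        """AR(p) estimation by the covariance method (forward prediction errors only)."""
        x = np.asarray(x)
        N = len(x)
        # Row n (= p..N-1) holds x(n-1), ..., x(n-p); predict x(n) from these samples.
        X = np.column_stack([x[p - k:N - k] for k in range(1, p + 1)])
        y = x[p:]
        a, *_ = np.linalg.lstsq(X, -y, rcond=None)      # a_hat(1..p)
        err = np.mean(np.abs(y + X @ a) ** 2)           # residual prediction-error power
        return np.concatenate(([1.0], a)), err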
C. Modified covariance method
In the modified covariance method the AR parameters â_p(k) are also determined by solving the normal
equations in Eq. (54), but with the estimate r̂_x(k, l) computed from both the forward and the backward
prediction errors.
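Correspondingly, a sketch of the modified covariance method that simply stacks the conjugated backward-prediction rows under the forward ones and solves the joint least-squares problem:

    import numpy as np

    def modified_covariance_ar(x, p):
        """AR(p) estimation by the modified covariance method
        (forward and backward prediction errors minimized jointly)."""
        x = np.asarray(x)
        N = len(x)
        Xf = np.column_stack([x[p - k:N - k] for k in range(1, p + 1)])           # forward rows
        yf = x[p:]
        Xb = np.column_stack([np.conj(x[k:N - p + k]) for k in range(1, p + 1)])  # backward rows
        yb = np.conj(x[:N - p])
        a, *_ = np.linalg.lstsq(np.vstack([Xf, Xb]), -np.concatenate([yf, yb]), rcond=None)
        return np.concatenate(([1.0], a))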

Example 5. Estimation of the power spectrum of an AR(4) process.

Consider the AR(4) process generated by the difference equation
x(n) = 2.7377 x(n−1) − 3.7476 x(n−2) + 2.6293 x(n−3) − 0.9224 x(n−4) + w(n)
where w(n) is unit-variance white Gaussian noise. The filter generating x(n) has a pair of poles at
z = 0.98 e^{±j0.2π} and a pair of poles at z = 0.98 e^{±j0.3π}. Using data records of length N = 128, an
ensemble of 50 spectrum estimates was calculated using the Yule-Walker method, the covariance method,
the modified covariance method, and the Burg method. The overlay plots of the 50 estimates from the four
methods are shown in part (a) of Figs. 8.25 to 8.28, and the ensemble average of the 50 estimates together
with the true power spectrum is shown in part (b) of Figs. 8.25 to 8.28.
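A sketch of the data generation in Example 5 (Python; the filter coefficients come directly from the difference equation above, and np.roots merely confirms the stated pole locations):

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(0)
    # x(n) - 2.7377 x(n-1) + 3.7476 x(n-2) - 2.6293 x(n-3) + 0.9224 x(n-4) = w(n)
    a = [1.0, -2.7377, 3.7476, -2.6293, 0.9224]
    poles = np.roots(a)
    print(np.abs(poles), np.angle(poles) / np.pi)   # magnitudes ~0.98, angles ~ +/-0.2, +/-0.3 (x pi)

    N, n_trials = 128, 50
    # Ensemble of 50 realizations of length N = 128, each driven by unit-variance white noise.
    realizations = [lfilter([1.0], a, rng.standard_normal(N)) for _ in range(n_trials)]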

Minimum Variance Spectrum Estimation (a nonparametric method):

In the minimum variance (MV) method the power spectrum is estimated by filtering a random process with a
bank of narrowband bandpass filters. The bandpass filters are designed to be optimum by minimizing the
variance of the output of a narrowband filter that adapts to the spectral content of the input process at each
frequency of interest.
A. The relation of the variance with the power spectrum
Consider a zero-mean WSS process y(n). The variance of y(n) is

\sigma_y^2 = E\{ |y(n)|^2 \} ,   (70)

which is the power of the process y(n). For a given autocorrelation r_y(k) we have \sigma_y^2 = r_y(0) =
\frac{1}{2\pi} \int_{-\pi}^{\pi} P_y(e^{j\omega})\, d\omega, which relates the variance to the power spectrum.

B. FIR bandpass filter bank and the variances of the filters’ outputs:

Consider a bank of FIR bandpass filters (Fig. 3), all having order p and frequency responses (or system
functions) of the following form. The input to the filters is x(n), and the outputs of the bandpass filters are
y_i(n) for i = 0, 1, …, L. To use such a filter bank to estimate the power spectrum of x(n) from a
finite-length data record, we constrain all bandpass filters to have unit gain at their center frequencies ω_i,

C. The FIR bandpass filters with minimum variance

Designing a filter is just determining the filter coefficients based on a certain criterion. The criterion used
here is the minimum variance of y_i(n), which is obtained by minimizing \sigma_{y_i}^2 in Eq. (81) under
the constraint given by Eq. (78). The approach to this constrained minimization problem is given in Section
2.3.10 of the Hayes textbook.
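The solution of that constrained minimization (the standard result the Hayes reference arrives at) is g_i = R_x^{-1} e_i / (e_i^H R_x^{-1} e_i), where e_i = [1, e^{jω_i}, …, e^{jpω_i}]^T and R_x is the (p+1) × (p+1) autocorrelation matrix of the input. A minimal sketch:

    import numpy as np

    def mv_bandpass_filter(R, w_i):
        """Order-p minimum-variance FIR filter with unit gain at center frequency w_i.
        R is the (p+1) x (p+1) autocorrelation matrix of the input process."""
        p = R.shape[0] - 1
        e = np.exp(1j * w_i * np.arange(p + 1))          # steering vector at w_i
        Rinv_e = np.linalg.solve(R, e)
        return Rinv_e / np.real(np.conj(e) @ Rinv_e)     # g = R^{-1} e / (e^H R^{-1} e)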

D. The minimum variance spectral estimate

To find the power spectrum estimate, let us look at the bandpass filter bank again. Since the bandpass filters
are narrowband and the bandwidth of the ith filter G_i(e^{jω}) is assumed to be ∆, then within the bandwidth,
that is, for ω_i − ∆/2 ≤ ω ≤ ω_i + ∆/2, we may assume G_i(e^{jω}) ≈ 1 (due to the constraint given in
Eq. (75)), and outside the bandwidth ∆, G_i(e^{jω}) ≈ 0. In this case, the relation of the variance of y_i(n) to
the power spectrum of x(n) in Eq. (73) becomes
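The closed form this derivation leads to is the standard minimum variance estimate P̂_MV(e^{jω}) = (p + 1) / (e^H R̂_x^{-1} e), with e = [1, e^{jω}, …, e^{jpω}]^T and R̂_x the estimated (p+1) × (p+1) autocorrelation matrix. A minimal sketch, reusing acorr_biased from the sketch after Eq. (47):

    import numpy as np
    from scipy.linalg import toeplitz

    def mv_psd(x, p, n_freq=512):
        """Minimum variance spectrum estimate P_MV(e^{jw}) = (p+1) / (e^H R^{-1} e)."""
        r = acorr_biased(x, p)                            # r_hat(0..p), Eq. (47)
        R = toeplitz(r, np.conj(r))                       # Hermitian Toeplitz autocorrelation matrix
        w = np.linspace(0, np.pi, n_freq)
        E = np.exp(1j * np.outer(np.arange(p + 1), w))    # steering vectors as columns
        Q = np.linalg.solve(R, E)                         # R^{-1} e(w) for all frequencies at once
        return w, (p + 1) / np.real(np.sum(np.conj(E) * Q, axis=0))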

Maximum entropy method (an AR method):

The maximum entropy spectral estimation is established based on an explicit extrapolation of a finite length
sequence of a known autocorrelation of a random process x(n). The extrapolation should be chosen so that
the random process characterized by the extrapolated autocorrelation sequence has maximum entropy. The
random process treated here is assumed to be Gaussian so that the problem concerned becomes solvable.

A. The concept of entropy

Entropy is a measure of randomness or uncertainty. For a Gaussian random process x(n) with power
spectrum P_x(e^{jω}), the entropy of the random variable x(n) is expressed by

B. Extrapolation of the autocorrelation

Given the autocorrelation r_x(k) of a WSS process for |k| ≤ p, we want to extrapolate r_x(k) for |k| > p.
Supposing that the extrapolated autocorrelation is r_e(k), the power spectrum of x(n) can be written as

P_x(e^{j\omega}) = \sum_{k=-p}^{p} r_x(k)\, e^{-jk\omega} + \sum_{|k| > p} r_e(k)\, e^{-jk\omega} .   (98)
Now the question is what criterion should be used to determine the extrapolated autocorrelation. As
the name of the method indicates, maximum entropy is the criterion for performing the extrapolation. A
maximum entropy extrapolation is equivalent to finding the sequence of extrapolated autocorrelations
that makes x(n) as white (random) as possible. From the power spectrum point of view, this maximum
entropy extrapolation makes the power spectrum as flat as possible.

C. The maximum entropy spectral estimate

If a random process x(n) is assumed to be a Gaussian process with a given segment of the autocorrelation,
r_x(k) for |k| ≤ p, then the extrapolated autocorrelation r_e(k) that maximizes the entropy in Eq. (97) can be
found by setting \partial H(x) / \partial r_e^*(k) = 0, specifically
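The resulting estimate (the well-known conclusion of this derivation) has exactly the AR form of Eq. (49), with a_p(k) and b(0) obtained from the given lags r_x(k), |k| ≤ p. A minimal sketch, assuming the known lags are passed in directly as r = [r_x(0), …, r_x(p)] and that the process is real-valued:

    import numpy as np
    from scipy.linalg import toeplitz
    from scipy.signal import freqz

    def mem_psd(r, n_freq=512):
        """Maximum entropy spectrum from the known autocorrelation lags r = [r(0), ..., r(p)]."""
        r = np.asarray(r, dtype=float)
        p = len(r) - 1
        a = np.linalg.solve(toeplitz(r[:p]), -r[1:])    # a_p(1..p) from the Yule-Walker equations
        eps = r[0] + a @ r[1:]                          # modeling error, i.e. |b(0)|^2
        w, H = freqz([1.0], np.concatenate(([1.0], a)), worN=n_freq)   # H = 1 / A(e^{jw})
        return w, eps * np.abs(H) ** 2                  # Eq. (49): |b(0)|^2 / |A(e^{jw})|^2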

Summary of spectral estimation methods:

• Power spectrum of a WSS process x(n): P_x(e^{j\omega}) = \sum_{k=-\infty}^{\infty} r_x(k)\, e^{-jk\omega} (note −∞ < k < ∞), which is
the Fourier transform of the autocorrelation sequence r_x(k).
• P_x(e^{jω}) can only be estimated when the data available for a random process x(n) are of finite length or
the data are contaminated with noise.
• Estimating the power spectrum is equivalent to estimating the autocorrelation.
• Two classes of methods for power spectrum estimation: nonparametric methods and parametric methods.
In each class there is a set of methods.