Statistical Signal Processing
Introduction
Many signals have a stochastic structure or at least some stochastic component. Some of these signals are
a nuisance: noise gets in the way of receiving weak communication signals sent from deep space probes
and interference from other wireless calls disturbs cellular telephone systems. Many signals of interest are
also stochastic or modeled as such. Compression theory rests on a probabilistic model for every compressed
signal. Measurements of physical phenomena, like earthquakes, are stochastic. Statistical signal processing
algorithms work to extract the good despite the “efforts” of the bad.
This course covers the two basic approaches to statistical signal processing: estimation and detection. In
estimation, we want to determine a signal’s waveform or some signal aspect(s). Typically the parameter or
signal we want is buried in noise. Estimation theory shows how to find the optimal approach
for extracting the information we seek. For example, designing the best filter for removing interference
from cell phone calls amounts to a signal waveform estimation algorithm. Determining the delay of a radar
signal amounts to a parameter estimation problem. The intent of detection theory is to provide rational
(instead of arbitrary) techniques for determining which of several conceptions—models—of data generation
and measurement is most “consistent” with a given set of data. In digital communication, the received signal
must be processed to determine whether it represents a binary “0” or “1”; in radar or sonar, the presence
or absence of a target must be determined from measurements of propagating fields; in seismic problems,
the presence of oil deposits must be inferred from measurements of sound propagation in the earth. Using
detection theory, we will derive signal processing algorithms which will give good answers to questions such
as these when the information-bearing signals are corrupted by superfluous signals (noise).
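The binary-communication problem above can be sketched numerically. The setup here is my own illustration, not taken from the text: a “0” is sent as amplitude 0, a “1” as amplitude 1, the channel adds white Gaussian noise, and the detector thresholds each received sample at the midpoint (the minimum-error rule for equally likely bits in Gaussian noise).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical signaling scheme: bit b is sent as amplitude b (0 or 1),
# corrupted by additive white Gaussian noise of standard deviation 0.3.
bits = rng.integers(0, 2, size=1000)
received = bits + 0.3 * rng.standard_normal(bits.size)

# For equally likely bits in Gaussian noise, the minimum-error detector
# simply thresholds each sample at the midpoint between the two amplitudes.
decisions = (received > 0.5).astype(int)

# Fraction of bits decided incorrectly despite the noise.
error_rate = np.mean(decisions != bits)
```

With this noise level the detector recovers most bits correctly; raising the noise standard deviation toward the amplitude separation drives the error rate up, which is exactly the trade-off detection theory quantifies.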
Probability and Stochastic Processes
Foundations of Probability Theory
Basic Definitions
The basis of probability theory is a set of events—sample space—and a systematic set of numbers—
probabilities—assigned to each event. The key aspect of the theory is the system of assigning probabilities.
Formally, a sample space is the set Ω of all possible outcomes ω_i of an experiment. An event is a collection
of sample points ω_i determined by some set-algebraic rules governed by the laws of Boolean algebra.
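These definitions can be made concrete with a small example of my own (a single die roll, not from the text): the sample space is a set of outcomes, events are subsets of it, and Boolean set operations combine events.

```python
# Sample space for one roll of a fair die, and two events as subsets of it.
omega = {1, 2, 3, 4, 5, 6}
even = {2, 4, 6}
at_most_3 = {1, 2, 3}

# Boolean (set) algebra on events: union, intersection, complement.
union = even | at_most_3          # outcomes in either event
intersection = even & at_most_3   # outcomes in both events
complement = omega - even         # outcomes not in the event

# Assigning probabilities: for a fair die each sample point gets weight 1/6,
# so an event's probability is its size divided by the size of Omega.
def prob(event):
    return len(event) / len(omega)
```

The key point the text makes is visible here: the event algebra (sets and their combinations) is fixed by Boolean rules, while the probability assignment is a separate, systematic layer on top of it.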
Random Variables and Probability Density Functions
A random variable X is the assignment of a number—real or complex—to each sample point in sample space;
mathematically, X : Ω → R. Thus, a random variable can be considered a function whose domain is a set and
whose range is, most commonly, a subset of the real line. This range could be discrete-valued (especially
when the domain W is discrete). In this case, the random variable is said to be symbolic-valued. In some
cases, the symbols can be related to the integers, and then the values of the random variable can be ordered.
When the range is continuous, an interval on the real line, say, we have a continuous-valued random variable.
In some cases, the random variable is a mixed random variable: it is both discrete- and continuous-valued.
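The view of a random variable as a function from sample points to numbers can be illustrated with a small sketch of my own (two coin flips, not an example from the text): each outcome maps to the number of heads, giving a discrete-valued range.

```python
# Sample space for two coin flips; each string is one sample point.
omega = ["HH", "HT", "TH", "TT"]

# A random variable is a function on the sample space: here, X maps
# each outcome to its number of heads, so the range is {0, 1, 2}.
def X(outcome):
    return outcome.count("H")

values = sorted({X(w) for w in omega})

# The probability mass function induced by X, assuming all four
# outcomes are equally likely: count the sample points mapped to each value.
pmf = {v: sum(1 for w in omega if X(w) == v) / len(omega) for v in values}
```

Because the range here is a finite set of integers, X is a discrete random variable; replacing the coin flips with, say, a measured voltage would give a range that is an interval of the real line, the continuous-valued case the text describes next.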