An Introduction to Statistical Signal Processing
Introduction
A random or stochastic process is a mathematical model for a phenomenon
that evolves in time in an unpredictable manner from the viewpoint of the
observer. The phenomenon may be a sequence of real-valued measurements
of voltage or temperature, a binary data stream from a computer, a modulated
binary data stream from a modem, a sequence of coin tosses, the
daily Dow–Jones average, radiometer data or photographs from deep space
probes, a sequence of images from cable television, or any of an infinite
number of possible sequences, waveforms, or signals of any imaginable type.
It may be unpredictable because of such effects as interference or noise in a
communication link or storage medium, or it may be an information-bearing
signal, deterministic from the viewpoint of an observer at the transmitter
but random to an observer at the receiver.
The theory of random processes quantifies the above notions so that
one can construct mathematical models of real phenomena that are both
tractable and meaningful in the sense of yielding useful predictions of future
behavior. Tractability is required in order for the engineer (or anyone
else) to be able to perform analyses and syntheses of random processes, perhaps
with the aid of computers. The “meaningful” requirement is that the
models must provide a reasonably good approximation of the actual phenomena.
An oversimplified model may provide results and conclusions that
do not apply to the real phenomenon being modeled. An overcomplicated
one may constrain potential applications, render theory too difficult to be
useful, and strain available computational resources. Perhaps the attribute
that most distinguishes an outstanding engineer from an average one is the
ability to derive effective models that strike a good balance between
complexity and accuracy.
Organization of the book
Chapter 2 provides a careful development of the fundamental concept of
probability theory – a probability space or experiment. The notions of sample
space, event space, and probability measure are introduced and illustrated
by examples. Independence and elementary conditional probability
are developed in some detail. The ideas of signal processing and of random
variables are introduced briefly as functions or operations on the output of
an experiment. This in turn allows mention of the idea of expectation at an
early stage as a generalization of the description of probabilities by sums or
integrals.
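As a preview of that generalization (standard formulas, not yet developed
at this point in the text): for a discrete experiment described by a
probability mass function p, the probability of an event F is the sum
P(F) = Σ_{ω∈F} p(ω), and the expectation of a measurement g on the
experiment replaces the indicator of F by g itself, E[g] = Σ_ω g(ω)p(ω);
for a continuous experiment described by a probability density function f,
the sums become the integrals P(F) = ∫_F f(ω) dω and E[g] = ∫ g(ω)f(ω) dω.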
Chapter 3 treats the theory of measurements made on experiments:
random variables, which are scalar-valued measurements; random vectors,
which are finite collections of measurements; and random processes,
which can be viewed as sequences or waveforms of measurements.
Random variables, vectors, and processes can all be viewed as forms of signal
processing: each operates on “inputs,” which are the sample points of
a probability space, and produces an “output,” which is the resulting sample
value of the random variable, vector, or process. These output points
together constitute an output sample space, which inherits its own probability
measure from the structure of the measurement and the underlying
experiment. As a result, many of the basic properties of random variables,
vectors, and processes follow from those of probability spaces. Probability
distributions are introduced along with probability mass functions, probability
density functions, and cumulative distribution functions. The basic
derived distribution method is described and demonstrated by example. A
wide variety of examples of random variables, vectors, and processes are
treated. Expectations are introduced briefly as a means of characterizing
distributions and as a source of calculus practice.
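As a preview of the derived distribution method mentioned above (a standard
formulation, stated here only for orientation): if a random variable Y is
defined as a function Y = g(X) of a random variable X with a known
distribution, then the distribution of Y is determined by the inverse-image
formula P(Y ∈ F) = P(X ∈ g⁻¹(F)), where g⁻¹(F) = {x : g(x) ∈ F}. In
particular, the cumulative distribution function of Y is
F_Y(y) = P(g(X) ≤ y), which for a strictly increasing g reduces to
F_Y(y) = F_X(g⁻¹(y)).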
Spinning pointers and flipping coins
Many of the basic ideas at the core of this text can be introduced and illustrated
by two very simple examples: the continuous experiment of spinning
a pointer inside a circle and the discrete experiment of flipping a coin.
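Before developing these examples formally, a minimal simulation sketch may
help fix ideas (the function names, the unit interval [0, 1) as the
pointer's sample space, and the fairness of the coin are assumptions made
here purely for illustration):

import random

def spin_pointer():
    # Continuous experiment: the pointer's resting position,
    # modeled as a point drawn uniformly from the unit interval [0, 1).
    return random.random()

def flip_coin():
    # Discrete experiment: a fair coin flip, reported as
    # 0 (tails) or 1 (heads), each with probability 1/2.
    return random.randrange(2)

# Relative frequencies over many trials approximate the probabilities
# that the formal models assign.
n = 100_000
spins = [spin_pointer() for _ in range(n)]
flips = [flip_coin() for _ in range(n)]
print(sum(1 for s in spins if s < 0.25) / n)  # near 0.25: pointer in [0, 0.25)
print(sum(flips) / n)                         # near 0.5: fraction of heads

The uniform model for the pointer encodes the physical symmetry of the
spin: no arc of the circle is favored over any other arc of the same
length.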
Event spaces
Intuitively, an event space is a collection of subsets of the sample space or
groupings of elementary events which we shall consider as physical events
and to which we wish to assign probabilities. Mathematically, an event space
is a collection of subsets that is closed under certain set-theoretic operations;
that is, performing certain operations on events or members of the event
space must give other events. Thus, for example, if in the example of a
single voltage measurement we have Ω = ℜ and we are told that the set of
all voltages greater than 5 volts, {ω : ω ≥ 5}, is an event, that is, it is a
member of a sigma-field F of subsets of ℜ, then necessarily its complement
{ω : ω < 5} must also be an event, that is, a member of the sigma-field F.
If the latter set is not in F then F cannot be an event space! Observe that
no problem arises if the complement physically cannot happen – events that
“cannot occur” can be included in F and then assigned probability zero
when choosing the probability measure P. For example, even if you know
that the voltage does not exceed 5 volts, if you have chosen the real line
ℜ as your sample space, then you must include the set {r : r > 5} in the
event space if the set {r : r ≤ 5} is an event. The impossibility of a voltage
greater than 5 is then expressed by assigning P({r : r > 5}) = 0.
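To summarize the closure requirement in one place (the standard definition
of a sigma-field, consistent with the discussion above): a collection F of
subsets of a sample space Ω is a sigma-field, and hence an event space, if
(1) Ω ∈ F; (2) F ∈ F implies its complement F^c = {ω : ω ∉ F} ∈ F; and
(3) F_i ∈ F for i = 1, 2, . . . implies the countable union ∪_i F_i ∈ F.
The voltage example invokes exactly property (2): once {ω : ω ≥ 5} is in
F, its complement {ω : ω < 5} must be as well.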