04-05-2012, 11:30 AM
Neural Networks And Their Statistical Application
Neural Networks And Their Statistical Application.ppt (Size: 926 KB / Downloads: 37)
Why Neural Networks are desirable
The human brain can generalize from abstract information
Recognize patterns in the presence of noise
Recall memories
Make decisions for current problems based on prior experience
Why Desirable in Statistics
Prediction of future events based on past experience
Able to classify an input to the nearest pattern in memory; the match does not have to be exact
Predict latent variables that are not easily measured
Non-linear regression problems
What are Neural Networks?
The computational ability of a digital computer combined with the desirable functions of the human brain.
How the Process Works
Terminology, when to use neural networks, and why they are used in statistical applications.
Terminology
Input: Explanatory variables also referred to as “predictors”.
Neuron: Individual units in the hidden layer(s) of a neural network.
Output: Response variables also called “predictions”.
Hidden Layers: Layers between input and output that apply an activation function.
Weights: Parameters of the network, found by minimizing an objective function (usually sum-of-squares error) while training the network.
Backpropagation: Most popular training method for neural networks.
Network training: To find values of network parameters (weights) for performing a particular task.
Patterns: Set of predictors with their actual outputs, used in training the network.
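The terms above can be mapped onto a tiny network. The sizes, weights, and activation choice below are illustrative assumptions, not taken from the slides; this is a minimal sketch of one forward pass.

```python
import numpy as np

# Hypothetical toy network: 2 inputs (predictors), 3 hidden neurons, 1 output.
rng = np.random.default_rng(0)

inputs = np.array([0.5, -1.2])       # explanatory variables ("predictors")
W_hidden = rng.normal(size=(3, 2))   # synaptic weights, input -> hidden
W_output = rng.normal(size=(1, 3))   # synaptic weights, hidden -> output

def sigmoid(x):
    """Logistic activation function applied in the hidden layer."""
    return 1.0 / (1.0 + np.exp(-x))

hidden = sigmoid(W_hidden @ inputs)  # hidden layer: neurons apply the activation
output = W_output @ hidden           # output: the network's prediction

print(output.shape)  # (1,)
```

Training would then adjust W_hidden and W_output against a set of patterns; that step is what backpropagation does.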
When to use neural networks
Use for huge data sets (e.g. 50 predictors and 15,000 observations) with unknown distributions
Smaller data sets with outliers as neural networks are very resistant to outliers
Why Neural Networks in Statistics?
The methodology is seen as a new paradigm for data analysis where models are not explicitly stated but rather implicitly defined by the network.
Advanced pattern recognition capabilities
Allows for analysis where traditional methods might be extremely tedious or nearly impossible to interpret.
Feed Forward
A feed-forward network trained using backpropagation (a "backpropagation network") is the most commonly used algorithm, applied most often to time-series prediction problems.
We will see this algorithm in more detail soon
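As a rough preview of what backpropagation training looks like, here is a minimal sketch with one hidden layer and a sum-of-squares error. The data, network sizes, and learning rate are made-up assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

X = rng.normal(size=(50, 3))                    # 50 patterns, 3 predictors each
y = (X @ np.array([1.0, -2.0, 0.5]))[:, None]   # target outputs

W1 = rng.normal(scale=0.5, size=(3, 5))         # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(5, 1))         # hidden -> output weights
lr = 0.01

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for epoch in range(200):
    # forward pass
    h = sigmoid(X @ W1)                   # hidden activations
    y_hat = h @ W2                        # linear output layer
    err = y_hat - y
    losses.append(float(np.mean(err ** 2)))
    # backward pass: gradients of the sum-of-squares error
    grad_W2 = h.T @ err / len(X)
    delta_h = (err @ W2.T) * h * (1 - h)  # chain rule through the sigmoid
    grad_W1 = X.T @ delta_h / len(X)
    # gradient-descent weight update
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

print(losses[0] > losses[-1])  # True: training reduces the error
```

The backward pass propagates the output error back through the hidden layer, which is where the name "backpropagation" comes from.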
Adaline Network
Pattern recognition network
Essentially a single layer backpropagation network
Only recognizes exact training patterns
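An Adaline unit can be sketched as a single linear neuron trained with the LMS (Widrow-Hoff) rule. The patterns and learning rate below are illustrative assumptions.

```python
import numpy as np

# Four bipolar training patterns; the target is the sign of the first input.
X = np.array([[ 1.0,  1.0],
              [ 1.0, -1.0],
              [-1.0,  1.0],
              [-1.0, -1.0]])
targets = np.array([1.0, 1.0, -1.0, -1.0])

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(50):
    for x, t in zip(X, targets):
        out = w @ x + b        # linear activation (threshold applied only at the end)
        err = t - out          # error against the desired output
        w += lr * err * x      # LMS weight update
        b += lr * err

preds = np.sign(X @ w + b)
print(preds)  # [ 1.  1. -1. -1.]
```

Because the unit is linear, it can only separate patterns linearly, which matches the slide's point that it is essentially a single-layer network.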
Hopfield Model
The Hopfield model is used as an auto-associative memory to store and recall a set of bitmap images.
Associative recall of images: given an incomplete or corrupted version of a stored image, the network can recall the original
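This recall behavior can be sketched with a tiny bipolar "image". The pattern, Hebbian storage rule, and corruption below are illustrative assumptions.

```python
import numpy as np

# Store one bipolar pattern via the Hebb rule, then recall it from a corrupted copy.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])

# Hebbian weight matrix (zero diagonal: no self-connections)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# Corrupt two elements of the stored pattern
probe = pattern.copy()
probe[0] *= -1
probe[3] *= -1

# Synchronous updates until the state stops changing
state = probe
for _ in range(10):
    new_state = np.where(W @ state >= 0, 1, -1)
    if np.array_equal(new_state, state):
        break
    state = new_state

print(np.array_equal(state, pattern))  # True: the original pattern is recalled
```

With a real bitmap image the principle is the same: each pixel becomes a bipolar unit and the network settles back to the nearest stored image.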
Boltzmann Machine
The Boltzmann machine is a stochastic version of the Hopfield model.
Used for optimization problems such as the classic traveling salesman problem
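The "stochastic version of the Hopfield model" can be sketched as the same weights with temperature-dependent random unit updates, cooled over time (simulated annealing). The pattern, temperature schedule, and step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
pattern = np.array([1, -1, 1, -1, 1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

def energy(state):
    """Hopfield/Boltzmann energy; the stochastic search tends to lower it."""
    return -0.5 * state @ W @ state

state = rng.choice([-1, 1], size=5)
T = 2.0
for step in range(200):
    i = rng.integers(5)
    field = W[i] @ state
    # probability the unit turns on, given its net input and the temperature
    p_on = 1.0 / (1.0 + np.exp(-2.0 * field / T))
    state[i] = 1 if rng.random() < p_on else -1
    T *= 0.98  # annealing: gradually cool toward deterministic behavior

print(energy(state))
```

At high temperature the units flip almost at random, letting the network escape poor local minima; as it cools, the dynamics approach the deterministic Hopfield update, which is what makes this useful for optimization problems.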
Note
Those are only a few of the more common network structures. Advanced users can build networks designed for a particular problem in many software packages readily available on the market today.
Weights
Each connection (arrow) in the previous diagram has a weight, also called the synaptic weight
Training adjusts these weights to reduce the error between the desired output and the actual output
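How adjusting a weight shrinks that error can be shown on a single connection. The numbers and learning rate below are illustrative assumptions, not from the slides.

```python
# One connection: output = weight * input; training nudges the weight
# in proportion to the error (the delta rule).
x = 2.0        # input along this connection
w = 0.5        # current synaptic weight
target = 3.0   # desired output
lr = 0.1

for _ in range(20):
    output = w * x             # actual output
    error = target - output    # desired minus actual
    w += lr * error * x        # adjust the weight to shrink the error

print(round(w * x, 3))  # 3.0: the output now matches the desired output
```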