Seminar Topics & Project Ideas On Computer Science Electronics Electrical Mechanical Engineering Civil MBA Medicine Nursing Science Physics Mathematics Chemistry ppt pdf doc presentation downloads and Abstract

Full Version: Backpropagation Algorithm
You're currently viewing a stripped down version of our content. View the full version with proper formatting.
Backpropagation Algorithm
Basic Neuron Model In A Feedforward Network
Inputs x_i arrive through pre-synaptic connections
Synaptic efficacy is modeled using real-valued weights w_i
The response of the neuron is a nonlinear function f of its weighted inputs
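To make this model concrete, here is a minimal sketch of such a neuron in Python, assuming a logistic sigmoid as the response function f; the function name and example values are illustrative, not taken from the original slides.

import math

def neuron_response(inputs, weights):
    # Weighted sum of the inputs x_i through the synaptic weights w_i
    net = sum(w * x for w, x in zip(weights, inputs))
    # Nonlinear response f applied to the weighted sum (here: logistic sigmoid)
    return 1.0 / (1.0 + math.exp(-net))

# Example: one excitatory (positive) and one inhibitory (negative) connection
print(neuron_response([0.5, 1.0], [0.8, -0.3]))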
Differences In Networks
Feedforward Networks
Solutions (desired outputs) are known
Weights are learned
The network evolves in the weight space
Used for:
Prediction
Classification
Function approximation
Feedback Networks
Solutions are unknown
Weights are prescribed
The network evolves in the state space
Used for:
Constraint satisfaction
Optimization
Feature matching
Inputs To Neurons
Arise from other neurons or from outside the network
Nodes whose inputs arise outside the network are called input nodes; they simply copy their input values to their outputs
An input may excite or inhibit the response of the neuron to which it is applied, depending upon the weight of the connection
Weights
Represent synaptic efficacy and may be excitatory or inhibitory
Normally, positive weights are considered excitatory, while negative weights are thought of as inhibitory
Learning is the process of modifying the weights in order to produce a network that performs some function
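To show what modifying the weights looks like in practice, here is a minimal gradient-descent sketch for a single sigmoid neuron, assuming the squared-error measure and the learning rate η introduced later in this outline; the names are hypothetical, not part of the original slides.

import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def update_weights(inputs, weights, target, eta=0.1):
    # Forward pass: weighted sum and nonlinear response
    net = sum(w * x for w, x in zip(weights, inputs))
    out = sigmoid(net)
    # For squared error E = 0.5 * (target - out)^2 and a sigmoid unit,
    # dE/dw_i = -(target - out) * out * (1 - out) * x_i
    delta = (target - out) * out * (1.0 - out)
    # Move each weight a small step (eta) against the error gradient
    return [w + eta * delta * x for w, x in zip(weights, inputs)]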
Output
The response function is normally nonlinear
Examples include
Sigmoid
Piecewise linear
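For reference, a minimal sketch of these two kinds of response function: a logistic sigmoid and a piecewise-linear ramp that saturates at 0 and 1. The thresholds lo and hi are arbitrary illustrative choices, not values from the slides.

import math

def sigmoid(net):
    # Smooth, differentiable response in (0, 1)
    return 1.0 / (1.0 + math.exp(-net))

def piecewise_linear(net, lo=-1.0, hi=1.0):
    # Linear between the two thresholds, saturating at 0 below lo and 1 above hi
    if net <= lo:
        return 0.0
    if net >= hi:
        return 1.0
    return (net - lo) / (hi - lo)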
Backpropagation Preparation
Training Set: a collection of input-output patterns used to train the network
Testing Set: a collection of input-output patterns used to assess network performance
Learning Rate (η): a scalar parameter, analogous to step size in numerical integration, used to set the rate of weight adjustments
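The sketch below shows how these three ingredients could fit together for a single sigmoid neuron: the training set drives the weight updates at rate eta, and the testing set is used only to measure performance afterwards. The training scheme, names, and data are illustrative assumptions, not taken from the slides.

import math
import random

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def train_and_test(training_set, testing_set, eta=0.5, epochs=1000):
    # Start from small random weights
    n_inputs = len(training_set[0][0])
    weights = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    for _ in range(epochs):
        for inputs, target in training_set:
            # Forward pass and gradient-descent update (see the weight-update sketch above)
            out = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
            delta = (target - out) * out * (1.0 - out)
            weights = [w + eta * delta * x for w, x in zip(weights, inputs)]
    # The testing set only assesses performance; it never adjusts the weights
    test_error = sum((target - sigmoid(sum(w * x for w, x in zip(weights, inputs)))) ** 2
                     for inputs, target in testing_set) / len(testing_set)
    return weights, test_error

# Hypothetical OR-style patterns; the leading 1.0 in each input acts as a bias term
# (the same patterns are reused for both sets here only to keep the example short)
patterns = [([1.0, 0.0, 0.0], 0.0), ([1.0, 0.0, 1.0], 1.0),
            ([1.0, 1.0, 0.0], 1.0), ([1.0, 1.0, 1.0], 1.0)]
weights, err = train_and_test(patterns, patterns)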
Network Error
Total-Sum-Squared-Error (TSSE)
Root-Mean-Squared-Error (RMSE)
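The outline names these two error measures without giving their formulas. A minimal sketch under one common convention (squared errors halved and summed over every pattern and output unit, with RMSE as the corresponding root mean) could look like this; the 0.5 factor is an assumption, not quoted from the slides.

import math

def tsse(targets, outputs):
    # Total-Sum-Squared-Error: 0.5 * sum over all patterns p and output units o
    # of (target_po - output_po)^2  (the 0.5 factor is a common convention)
    return 0.5 * sum((t - o) ** 2
                     for t_row, o_row in zip(targets, outputs)
                     for t, o in zip(t_row, o_row))

def rmse(targets, outputs):
    # Root-Mean-Squared-Error: undo the 0.5 factor and average over
    # (number of patterns * number of output units) before taking the root
    n = len(targets) * len(targets[0])
    return math.sqrt(2.0 * tsse(targets, outputs) / n)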