09-11-2012, 05:14 PM
LEAST MEAN SQUARE ALGORITHM
Introduction
The Least Mean Square (LMS) algorithm, introduced by Widrow and Hoff in 1959 [12], is an adaptive algorithm that uses a gradient-based method of steepest descent [10]. The LMS algorithm uses estimates of the gradient vector computed from the available data. It incorporates an iterative procedure that makes successive corrections to the weight vector in the direction of the negative of the gradient vector, which eventually leads to the minimum mean square error. Compared with other adaptive algorithms, the LMS algorithm is relatively simple: it requires neither correlation-function calculations nor matrix inversions.
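The iterative correction described above can be sketched as follows. This is a minimal NumPy implementation of the standard LMS update w(n+1) = w(n) + mu * e(n) * u(n); the function name `lms`, the tap count, and the step size `mu` are illustrative choices, not taken from the text.

```python
import numpy as np

def lms(x, d, num_taps, mu):
    """One-pass LMS adaptive FIR filter (illustrative sketch).

    x        : input signal (1-D array)
    d        : desired signal, same length as x
    num_taps : number of filter weights
    mu       : step size controlling the size of each correction
    Returns the final weight vector, the output y, and the error e.
    """
    n = len(x)
    w = np.zeros(num_taps)               # arbitrary initial weights
    y = np.zeros(n)
    e = np.zeros(n)
    for k in range(num_taps - 1, n):
        # most recent num_taps input samples, newest first
        u = x[k - num_taps + 1:k + 1][::-1]
        y[k] = w @ u                     # weighted (filter output) signal
        e[k] = d[k] - y[k]               # instantaneous error
        w = w + mu * e[k] * u            # step along the negative gradient estimate
    return w, y, e
```

Note that each iteration uses only the current input samples and the scalar error: no correlation matrix is formed and nothing is inverted, which is exactly the simplicity the text highlights.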
Weighted Signal
The plot in figure 6.6 shows the LMS algorithm tracking the desired signal. At the beginning of the adaptation process there is a significant error between the weighted signal y(t) and the desired signal, because the algorithm is initialized with an arbitrary weight that is nowhere near the optimum. As the adaptation proceeds, driven by the error computed at every iteration, the weight converges towards its optimum and the weighted signal y(t) follows the desired signal s(t) more closely, with only a small residual error.
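The convergence behaviour described above can be reproduced with a minimal single-weight sketch; the signals, the initial weight of -5.0, and the step size 0.05 are illustrative assumptions, not values from figure 6.6.

```python
import numpy as np

# Illustrative example: the desired signal is a scaled copy of the
# input, s(t) = 2.0 * x(t), so the optimum weight is 2.0. The weight
# starts at an arbitrary value far from the optimum, so the early
# error is large and shrinks as adaptation proceeds.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
s = 2.0 * x                      # desired signal
mu = 0.05                        # step size (assumed value)
w = -5.0                         # arbitrary initial weight
errors = []
for k in range(len(x)):
    y = w * x[k]                 # weighted signal y(t)
    e = s[k] - y                 # error at this iteration
    w += mu * e * x[k]           # LMS correction
    errors.append(abs(e))

early = np.mean(errors[:50])     # large: weight far from optimum
late = np.mean(errors[-50:])     # small: weight has converged to ~2.0
```

Plotting `errors` against the iteration index reproduces the qualitative shape of figure 6.6: a large initial error that decays as the weight approaches its optimum.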