
ADAPTIVE FILTERS I



  1. ADAPTIVE FILTERS I Searching the Performance Surface CHAPTER # 4 Instructor: Dr. Aamer Iqbal Bhatti

  2. Introduction • The performance surface of the Adaptive Linear Combiner is quadratic for stationary signals • In most applications the parameters of the error surface are not known and must be estimated • The problem is to devise algorithms that search the estimated performance surface.

  3. Methods of Searching the Performance Surface • Two descent algorithms for finding the optimal weight solution will be considered: a) Newton’s method b) the method of steepest descent • Newton’s method is a fundamental root-finding procedure in mathematics • It is comparatively difficult to implement • The weights are adjusted at each iteration
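To make the comparison concrete, here is a minimal Python sketch of Newton’s method in the one-weight case, assuming the quadratic surface $\xi(w) = \xi_{\min} + \lambda (w - w^{*})^{2}$; the values of lam, w_star and xi_min are illustrative, not taken from the slides.

```python
# Newton's method on an assumed single-weight quadratic performance surface.
lam, w_star, xi_min = 2.0, 1.5, 0.5   # illustrative values

def gradient(w):
    # d(xi)/dw for xi(w) = xi_min + lam * (w - w_star)**2
    return 2.0 * lam * (w - w_star)

def second_derivative():
    # d2(xi)/dw2 is the constant 2*lam on a quadratic surface
    return 2.0 * lam

w = 0.0                                          # initial guess
for k in range(3):
    w = w - gradient(w) / second_derivative()    # Newton update
    print(k, w)                                  # lands on w_star immediately
```

On a quadratic surface the Newton step divides the gradient by the constant second derivative and reaches $w^{*}$ in a single iteration; the practical difficulty is that this curvature (the inverse of the input correlation matrix R in the multi-weight case) must itself be estimated.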

  4. Methods of Searching the Performance Surface • With Newton’s method the weight changes always point in the direction of the minimum of the performance surface • The method of steepest descent is easy to implement • At each iteration the weights are moved in the direction of the negative gradient of the performance surface
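A matching steepest-descent sketch on the same assumed surface; the step size mu is an illustrative choice.

```python
# Steepest descent on the same assumed single-weight quadratic surface.
lam, w_star = 2.0, 1.5
mu = 0.1           # illustrative step size; stability requires 0 < mu < 1/lam

def gradient(w):
    return 2.0 * lam * (w - w_star)

w = 0.0
for k in range(20):
    w = w + mu * (-gradient(w))   # move along the negative gradient
print(w)  # approaches w_star geometrically with ratio r = 1 - 2*mu*lam
```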

  5. Basic Ideas of Gradient Search Method • We consider the simplest case, in which there is only one weight to adjust • The one-weight performance surface is a parabola

  6. Basic Ideas of Gradient Search Method • The performance surface for a single weight is represented by $\xi(w) = \xi_{\min} + \lambda\,(w - w^{*})^{2}$, where λ is the input signal power and $w^{*}$ is the optimal weight • The problem is to find the weight adjustment that minimizes the mean square error
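To see where the parabola comes from, the sketch below builds the one-weight MSE from assumed signal statistics (E_d2, p and lam are made-up example values) and checks that it equals the completed-square form above.

```python
# One-weight MSE: xi(w) = E[d^2] - 2*w*p + lam*w**2, with assumed statistics.
E_d2 = 5.0   # E[d_k^2], power of the desired signal (assumed)
p    = 3.0   # cross-correlation E[d_k * x_k] (assumed)
lam  = 2.0   # input power E[x_k^2], the 1x1 correlation "matrix" (assumed)

w_star = p / lam              # optimal weight
xi_min = E_d2 - p**2 / lam    # minimum mean square error

def xi(w):
    return E_d2 - 2.0 * w * p + lam * w**2

# The two forms of the parabola agree everywhere:
for w in (0.0, 1.0, w_star, 4.0):
    assert abs(xi(w) - (xi_min + lam * (w - w_star)**2)) < 1e-12
print(w_star, xi_min)   # 1.5 0.5
```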

  7. Methods of Searching the Performance Surface • An iterative method is used to find the value of the weight variable that minimizes the mean square error • Start with an initial guess • Measure the slope of the performance surface at this point • Choose a new value equal to the initial guess plus an increment proportional to the negative of the slope

  8. Methods of Searching the Performance Surface • Another new value is obtained, and the above procedure is repeated until the minimum is reached • The values obtained by measuring the slope at discrete points in time are called the “gradient estimate” • Using the negative of the gradient is necessary to proceed “downhill”, as the sketch below illustrates
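A sketch of this measure-the-slope procedure, assuming the same illustrative surface; delta is the perturbation used for the slope measurement.

```python
# Gradient search with a measured slope (finite-difference gradient estimate).
lam, w_star, xi_min = 2.0, 1.5, 0.5   # illustrative surface parameters
mu, delta = 0.1, 1e-4                 # step size and measurement offset

def xi(w):
    return xi_min + lam * (w - w_star)**2

def slope_estimate(w):
    # estimate the slope from two measurements of the surface near w
    return (xi(w + delta) - xi(w - delta)) / (2.0 * delta)

w = 0.0                                # initial guess
for k in range(30):
    w = w + mu * (-slope_estimate(w))  # proceed downhill
print(w)                               # converges toward w_star = 1.5
```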

  9. Gradient Search Algorithm and Solution • For a single variable the repetitive gradient search procedure can be represented as $w_{k+1} = w_k + \mu(-\nabla_k)$, where μ is a constant that governs the step size and $\nabla_k$ is the gradient at $w = w_k$

  10. Gradient Search Algorithm and Solution • The gradient for the single weight is given by $\nabla_k = \left.\frac{d\xi}{dw}\right|_{w=w_k} = 2\lambda\,(w_k - w^{*})$ • Substituting the gradient into the search equation gives $w_{k+1} = w_k - 2\mu\lambda\,(w_k - w^{*})$, from which the transients and the rate of convergence can be analyzed

  11. Gradient Search Algorithm and Solution • The above equation is a linear difference equation with constant coefficients • It can be solved by induction from the first few iterations

  12. Gradient Search Algorithm and Solution • The generalized result for the kth iteration is $w_k = w^{*} + (1 - 2\mu\lambda)^{k}\,(w_0 - w^{*})$ • This result gives the weight variable at any point in the search procedure and is thus a closed-form solution of the gradient search algorithm, as checked below
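A quick numerical check, on the same assumed surface, that this closed form agrees with the iterative recursion at every step.

```python
# Closed-form solution vs. the iterative gradient search recursion.
lam, w_star, mu, w0 = 2.0, 1.5, 0.1, 0.0   # illustrative values
r = 1.0 - 2.0 * mu * lam                    # geometric ratio

w = w0
for k in range(10):
    closed_form = w_star + r**k * (w0 - w_star)
    assert abs(w - closed_form) < 1e-9      # both give the same iterate
    w = w - 2.0 * mu * lam * (w - w_star)   # one step of the recursion
```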

  13. Stability and Rate of Convergence • In the solution of the gradient search equation the geometric ratio is given by $r = 1 - 2\mu\lambda$ • The search equation is stable if and only if $|r| < 1$, that is, $0 < \mu < 1/\lambda$ • If this condition is met, the algorithm converges to the optimal solution $w^{*}$ as $k \to \infty$
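The condition can be demonstrated numerically; with the illustrative λ = 2 the stability limit is 1/λ = 0.5, so the first three step sizes below converge and the last diverges.

```python
# Effect of the step size mu on stability (lam = 2, so 1/lam = 0.5).
lam, w_star, w0 = 2.0, 1.5, 0.0   # illustrative values

for mu in (0.1, 0.25, 0.4, 0.6):
    r = 1.0 - 2.0 * mu * lam
    w = w0
    for k in range(25):
        w = w - 2.0 * mu * lam * (w - w_star)
    print(f"mu={mu:4.2f}  r={r:+5.2f}  w after 25 steps = {w:.4g}")
```

For μ = 0.25 the ratio is r = 0 and the search converges in a single step; for μ = 0.6 the ratio is r = −1.4 and the iterates oscillate with growing amplitude.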

  14. Stability and Rate of Convergence • The figure depicts the gradient search for different values of the geometric ratio r

  15. Stability and Rate of Convergence • The effect of the choice of μ on r, and hence on the weight iteration, follows from $r = 1 - 2\mu\lambda$ and is summarized in the following table:

     μ = 0                r = 1           no adaptation
     0 < μ < 1/(2λ)       0 < r < 1       overdamped, monotonic convergence
     μ = 1/(2λ)           r = 0           convergence in a single step
     1/(2λ) < μ < 1/λ     −1 < r < 0      underdamped, oscillatory convergence
     μ > 1/λ or μ < 0     |r| > 1         unstable, divergence

  16. The Learning Curve • The effect of the weight adjustment on the mean square error can be observed from $\xi_k = \xi_{\min} + \lambda\,(w_k - w^{*})^{2}$ • With the weights updated at every iteration, the mean square error becomes $\xi_k = \xi_{\min} + \lambda\,r^{2k}(w_0 - w^{*})^{2}$

  17. The Learning Curve • The mean square error undergoes a geometric progression • The geometric ratio of the progression is given by $r^{2} = (1 - 2\mu\lambda)^{2}$ • Since this ratio can never be negative, the mean square error progression is never oscillatory, as the sketch below shows
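A short sketch of this point on the assumed surface: the step size is chosen so that r = −0.6, so the weight iterates oscillate around $w^{*}$ while the mean square error still decreases monotonically with ratio r² = 0.36.

```python
# Learning curve: the MSE decays with ratio r**2 >= 0, never oscillating,
# even when the weight iterates themselves oscillate (r < 0).
lam, w_star, xi_min, w0 = 2.0, 1.5, 0.5, 0.0   # illustrative values
mu = 0.4                                        # gives r = -0.6
r = 1.0 - 2.0 * mu * lam

w = w0
for k in range(8):
    xi_k = xi_min + lam * (w - w_star)**2
    # equivalently: xi_k == xi_min + lam * r**(2*k) * (w0 - w_star)**2
    print(f"k={k}  w={w:+.4f}  xi={xi_k:.5f}")
    w = w - 2.0 * mu * lam * (w - w_star)       # weight update
```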

  18. The Learning Curve • For the single-weight system the figure shows the relaxation of the mean square error from its initial value towards the optimal value $\xi_{\min}$ • The learning curve is shown for r = 0.5

  19. The Learning Curve
