
Widrow-Hoff Learning (LMS Algorithm)



Presentation Transcript


  1. Widrow-Hoff Learning (LMS Algorithm)

  2. ADALINE Network
  ${}_i\mathbf{w} = \begin{bmatrix} w_{i,1} & w_{i,2} & \cdots & w_{i,R} \end{bmatrix}^T$ (the $i$th row of the weight matrix $\mathbf{W}$)

  3. Two-Input ADALINE

  4. Mean Square Error
  Training Set: $\{\mathbf{p}_1, t_1\}, \{\mathbf{p}_2, t_2\}, \ldots, \{\mathbf{p}_Q, t_Q\}$, where each input $\mathbf{p}_q$ is paired with a target $t_q$.
  Notation: $\mathbf{x} = \begin{bmatrix} {}_1\mathbf{w} \\ b \end{bmatrix}$, $\mathbf{z} = \begin{bmatrix} \mathbf{p} \\ 1 \end{bmatrix}$, so the network output is $a = {}_1\mathbf{w}^T\mathbf{p} + b = \mathbf{x}^T\mathbf{z}$.
  Mean Square Error: $F(\mathbf{x}) = E[e^2] = E[(t - a)^2] = E[(t - \mathbf{x}^T\mathbf{z})^2]$

  5. Error Analysis
  The mean square error for the ADALINE network is a quadratic function:
  $F(\mathbf{x}) = c - 2\mathbf{x}^T\mathbf{h} + \mathbf{x}^T\mathbf{R}\mathbf{x}$, where $c = E[t^2]$, $\mathbf{h} = E[t\mathbf{z}]$, and $\mathbf{R} = E[\mathbf{z}\mathbf{z}^T]$.

  6. Stationary Point
  Hessian Matrix: $\nabla^2 F(\mathbf{x}) = 2\mathbf{R}$. The correlation matrix $\mathbf{R}$ must be at least positive semidefinite. If it has any zero eigenvalues, the performance index will either have a weak minimum or no stationary point; otherwise there will be a unique global minimum $\mathbf{x}^*$.
  If $\mathbf{R}$ is positive definite, setting the gradient $\nabla F(\mathbf{x}) = -2\mathbf{h} + 2\mathbf{R}\mathbf{x}$ to zero gives $\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$.
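As a concrete illustration of these definitions (not part of the original slides), the sketch below estimates $\mathbf{h} = E[t\mathbf{z}]$ and $\mathbf{R} = E[\mathbf{z}\mathbf{z}^T]$ from sample data and solves for the stationary point $\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$. The data-generating weights, bias, and sample size are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear data source: t = w_true . p + b_true (assumed for illustration)
w_true, b_true = np.array([1.5, -0.8]), 0.3
P = rng.normal(size=(1000, 2))                  # inputs p(k)
t = P @ w_true + b_true                         # targets t(k)

# Augmented input z = [p; 1] so that a = x^T z with x = [1w; b]
Z = np.hstack([P, np.ones((P.shape[0], 1))])

# Sample estimates of h = E[t z] and R = E[z z^T]
h = (Z * t[:, None]).mean(axis=0)
R = (Z.T @ Z) / Z.shape[0]

# Stationary point (unique global minimum when R is positive definite)
x_star = np.linalg.solve(R, h)
print(x_star)   # should recover approximately [1.5, -0.8, 0.3]
```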

  7. Approximate Steepest Descent
  Approximate mean square error (one sample): $\hat{F}(\mathbf{x}) = (t(k) - a(k))^2 = e^2(k)$
  Approximate (stochastic) gradient: $\hat{\nabla} F(\mathbf{x}) = \nabla e^2(k)$

  8. Approximate Gradient Calculation
  $\dfrac{\partial e^2(k)}{\partial w_{1,j}} = 2e(k)\dfrac{\partial e(k)}{\partial w_{1,j}} = -2e(k)p_j(k)$ and $\dfrac{\partial e^2(k)}{\partial b} = -2e(k)$,
  so the approximate gradient is $\hat{\nabla} F(\mathbf{x}) = \nabla e^2(k) = -2e(k)\mathbf{z}(k)$.

  9. LMS Algorithm
  Substituting the stochastic gradient into the steepest descent update $\mathbf{x}(k+1) = \mathbf{x}(k) - \alpha\hat{\nabla}F(\mathbf{x})$ gives
  $\mathbf{x}(k+1) = \mathbf{x}(k) + 2\alpha e(k)\mathbf{z}(k)$, i.e. ${}_1\mathbf{w}(k+1) = {}_1\mathbf{w}(k) + 2\alpha e(k)\mathbf{p}(k)$ and $b(k+1) = b(k) + 2\alpha e(k)$.

  10. Multiple-Neuron Case
  Matrix Form: $\mathbf{W}(k+1) = \mathbf{W}(k) + 2\alpha\,\mathbf{e}(k)\,\mathbf{p}^T(k)$, $\quad \mathbf{b}(k+1) = \mathbf{b}(k) + 2\alpha\,\mathbf{e}(k)$
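A minimal sketch of the matrix-form LMS update, assuming a purely linear (ADALINE) layer; the training data, learning rate, and initialization below are illustrative choices, not values from the slides.

```python
import numpy as np

def lms_train(P, T, alpha=0.02, epochs=20, seed=0):
    """Matrix-form LMS: W <- W + 2*alpha*e*p^T, b <- b + 2*alpha*e.

    P: (R, Q) array of input columns p(k); T: (S, Q) array of target columns t(k).
    """
    rng = np.random.default_rng(seed)
    R, Q = P.shape
    S = T.shape[0]
    W = rng.normal(scale=0.1, size=(S, R))
    b = np.zeros(S)
    for _ in range(epochs):
        for k in range(Q):
            p, t = P[:, k], T[:, k]
            a = W @ p + b                  # linear (purelin) output
            e = t - a                      # error
            W += 2 * alpha * np.outer(e, p)
            b += 2 * alpha * e
    return W, b

# Illustrative use on a toy linear mapping (assumed data, not from the slides)
P = np.random.default_rng(1).normal(size=(3, 200))
T = np.array([[1.0, -2.0, 0.5]]) @ P + 0.25
W, b = lms_train(P, T)
print(W, b)
```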

  11. Analysis of Convergence
  Taking the expectation of the LMS update gives $E[\mathbf{x}(k+1)] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}(k)] + 2\alpha\mathbf{h}$. For stability, the eigenvalues of the matrix $[\mathbf{I} - 2\alpha\mathbf{R}]$ must fall inside the unit circle.

  12. Conditions for Stability
  The eigenvalues of $[\mathbf{I} - 2\alpha\mathbf{R}]$ are $1 - 2\alpha\lambda_i$, where $\lambda_i$ is an eigenvalue of $\mathbf{R}$. Since $\lambda_i > 0$ (for $\mathbf{R}$ positive definite) and $\alpha > 0$, we always have $1 - 2\alpha\lambda_i < 1$; stability therefore requires $1 - 2\alpha\lambda_i > -1$, i.e. $\alpha < 1/\lambda_i$ for every $i$. The stability condition simplifies to $0 < \alpha < 1/\lambda_{\max}$.
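A small numeric check of this bound (illustrative only): estimate $\mathbf{R}$ from sample inputs, compute its largest eigenvalue, and choose a learning rate below $1/\lambda_{\max}$. The input distribution and the safety factor of 0.5 are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.normal(size=(500, 3))
Z = np.hstack([P, np.ones((500, 1))])        # augmented inputs z = [p; 1]
R = (Z.T @ Z) / Z.shape[0]                   # sample input correlation matrix

lam_max = np.linalg.eigvalsh(R).max()
alpha_bound = 1.0 / lam_max                  # LMS is stable in the mean for 0 < alpha < 1/lam_max
alpha = 0.5 * alpha_bound                    # a conservative choice below the bound
print(lam_max, alpha_bound, alpha)
```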

  13. Steady State Response
  If the system is stable, a steady state condition will be reached: $E[\mathbf{x}_{ss}] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}_{ss}] + 2\alpha\mathbf{h}$. The solution to this equation is $E[\mathbf{x}_{ss}] = \mathbf{R}^{-1}\mathbf{h} = \mathbf{x}^*$, which is also the strong minimum of the performance index.

  14. Example Banana Apple

  15. Iteration One Banana

  16. Iteration Two Apple

  17. Iteration Three
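Slides 14-17 step through three LMS iterations on a banana/apple classification task. The sketch below reproduces that kind of trace; the prototype patterns, targets (-1 for banana, +1 for apple), zero initial weights, and learning rate $\alpha = 0.2$ are assumptions chosen for illustration, since the original vectors are not recoverable from this transcript.

```python
import numpy as np

# Assumed prototype patterns (shape, texture, weight) -- illustrative, not from the slides
p_banana = np.array([-1.0,  1.0, -1.0])
p_apple  = np.array([ 1.0,  1.0, -1.0])
patterns = [p_banana, p_apple]
targets  = [-1.0, 1.0]          # assumed targets: -1 = banana, +1 = apple

alpha = 0.2                     # assumed learning rate
W = np.zeros(3)                 # weights initialized to zero, no bias

for k in range(3):              # three iterations, mirroring slides 15-17
    p, t = patterns[k % 2], targets[k % 2]
    a = W @ p                   # ADALINE output
    e = t - a                   # error
    W = W + 2 * alpha * e * p   # LMS update
    print(f"iteration {k + 1}: e = {e:+.2f}, W = {W}")
```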

  18. Adaptive Filtering Tapped Delay Line Adaptive Filter
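A sketch of the tapped delay line adaptive filter: the ADALINE's input vector at time $k$ is the current and delayed samples of a signal, and its output is $a(k) = \sum_{i=1}^{R} w_{1,i}\, y(k-i+1)$. The filter length, weights, and input sequence below are assumptions for illustration.

```python
import numpy as np

def adaptive_filter_output(w, y, k):
    """Output of a tapped-delay-line ADALINE at time k:
    a(k) = sum_{i=1..R} w[i-1] * y(k - i + 1), with y(j) = 0 for j < 0."""
    R = len(w)
    taps = np.array([y[k - i] if k - i >= 0 else 0.0 for i in range(R)])
    return w @ taps

# Illustrative use: a 3-tap filter applied to an assumed sinusoidal input
w = np.array([0.5, -0.25, 0.1])
y = np.sin(2 * np.pi * np.arange(20) / 8)
a = [adaptive_filter_output(w, y, k) for k in range(len(y))]
print(np.round(a, 3))
```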

  19. Example: Noise Cancellation

  20. Noise Cancellation Adaptive Filter
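A runnable sketch of the noise cancellation setup on these slides: the primary input is a signal $s(k)$ plus filtered noise $m(k)$; the adaptive filter sees the raw noise source $v(k)$ (and one delayed sample) and learns to reproduce $m(k)$, so the error $e(k) = s(k) + m(k) - a(k)$ approximates the clean signal. The sinusoidal noise matches the "Signals" slide below; the random signal $s(k)$ and the learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 400
k = np.arange(N)

s = rng.uniform(-0.2, 0.2, size=N)                   # assumed signal to recover
v = 1.2 * np.sin(2 * np.pi * k / 3)                  # noise source
m = 1.2 * np.sin(2 * np.pi * k / 3 - 3 * np.pi / 4)  # filtered noise contaminating the signal
t = s + m                                            # contaminated measurement = target

# Two-tap adaptive filter: inputs are v(k) and v(k-1)
w = np.zeros(2)
alpha = 0.1                                          # assumed learning rate
restored = np.empty(N)
for i in range(N):
    z = np.array([v[i], v[i - 1] if i > 0 else 0.0])
    a = w @ z                                        # filter's estimate of m(k)
    e = t[i] - a                                     # error = restored signal estimate
    w += 2 * alpha * e * z                           # LMS update
    restored[i] = e

print("final weights:", w)
print("residual noise power:", np.mean((restored[100:] - s[100:]) ** 2))
```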

  21. Correlation Matrix

  22. Signals
  $m(k) = 1.2\sin\!\left(\dfrac{2\pi k}{3} - \dfrac{3\pi}{4}\right)$, and the correlation between successive noise samples is $E[v(k)v(k-1)] = (1.2)^2(0.5)\cos\!\left(\dfrac{2\pi}{3}\right) = -0.36$.

  23. Stationary Point
  $\mathbf{h} = \begin{bmatrix} E[(s(k) + m(k))\,v(k)] \\ E[(s(k) + m(k))\,v(k-1)] \end{bmatrix}$; the cross terms $E[s(k)v(k)]$ and $E[s(k)v(k-1)]$ are zero because the signal is uncorrelated with the noise source.
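A numeric sketch (illustrative) that estimates the correlation matrix $\mathbf{R}$, the vector $\mathbf{h}$, and the stationary point $\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$ for this example by time-averaging the signals; $s(k)$ is assumed to be zero-mean random noise uncorrelated with $v(k)$, as in the setup above.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 30000
k = np.arange(N)

s = rng.uniform(-0.2, 0.2, size=N)                       # assumed uncorrelated signal
v = 1.2 * np.sin(2 * np.pi * k / 3)                      # noise source
m = 1.2 * np.sin(2 * np.pi * k / 3 - 3 * np.pi / 4)      # filtered noise
t = s + m                                                # contaminated measurement

Z = np.stack([v[1:], v[:-1]], axis=1)                    # z(k) = [v(k); v(k-1)]
R = (Z.T @ Z) / Z.shape[0]                               # approx [[0.72, -0.36], [-0.36, 0.72]]
h = (Z * t[1:, None]).mean(axis=0)

x_star = np.linalg.solve(R, h)                           # stationary point = optimal filter weights
print(np.round(R, 2), np.round(h, 2), np.round(x_star, 2))
```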

  24. Performance Index

  25. LMS Response

  26. Echo Cancellation
