Derivation of Recursive Least Squares


  1. Derivation of Recursive Least Squares. Given that Φ_n is the collection of the first n data points, the least squares solution follows in closed form. Now, what happens when we increase n by 1? When a new data point comes in, we need to re-estimate the parameters; this requires repeating the calculations and recomputing the matrix inverse, which is expensive in computer time and storage. CY3A2 System identification
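The equations on this slide were lost in transcription; they are presumably the standard batch least squares expressions. A reconstruction, assuming the usual notation (Φ_n for the stacked regressor matrix, y_n for the stacked outputs, φ_k for the regressor at step k):

```latex
y_n = \Phi_n \theta + e_n, \qquad
\Phi_n = \begin{bmatrix} \phi_1^T \\ \vdots \\ \phi_n^T \end{bmatrix}, \qquad
\hat{\theta}_n = \left( \Phi_n^T \Phi_n \right)^{-1} \Phi_n^T y_n .
```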

  2. Let us look at the expression for the least squares estimate, and define the quantities that will be updated recursively.
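The definitions themselves are missing from the transcript. A plausible reconstruction (the names P_n and B_n are assumptions, chosen to match the P matrix referred to on later slides):

```latex
P_n = \left( \Phi_n^T \Phi_n \right)^{-1}
    = \Big( \sum_{k=1}^{n} \phi_k \phi_k^T \Big)^{-1}, \qquad
B_n = \Phi_n^T y_n = \sum_{k=1}^{n} \phi_k y_k, \qquad
\hat{\theta}_n = P_n B_n .
```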

  3. With definitions (1) and (2), the least squares estimate at data point n gives equations (3) and (4). Substituting (4) into (3) and then applying (1) yields the recursive update.
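The numbered equations were lost with the slide images; under the definitions above, the standard derivation (the numbering here is a guess at the original) is:

```latex
\begin{aligned}
&(1)\quad P_{n+1}^{-1} = P_n^{-1} + \phi_{n+1}\phi_{n+1}^T \\
&(2)\quad B_{n+1} = B_n + \phi_{n+1} y_{n+1} \\
&(3)\quad \hat{\theta}_{n+1} = P_{n+1} B_{n+1} \\
&(4)\quad B_n = P_n^{-1} \hat{\theta}_n
\end{aligned}
```

Substituting (4) into (3) and applying (1):

```latex
\hat{\theta}_{n+1}
= P_{n+1}\!\left( P_n^{-1}\hat{\theta}_n + \phi_{n+1} y_{n+1} \right)
= \hat{\theta}_n + P_{n+1}\,\phi_{n+1}\left( y_{n+1} - \phi_{n+1}^T \hat{\theta}_n \right).
```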

  4. The RLS equations follow, but we still require a matrix inverse to be calculated in (8). Matrix Inversion Lemma: if A, C, and BCD are nonsingular square matrices (the inverses exist), then [A + BCD]⁻¹ can be written in terms of A⁻¹ and C⁻¹ alone.
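The statement of the lemma is missing; the standard form (also known as the Sherman–Morrison–Woodbury identity) is:

```latex
\left[ A + BCD \right]^{-1}
= A^{-1} - A^{-1} B \left( C^{-1} + D A^{-1} B \right)^{-1} D A^{-1} .
```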

  5. The best way to prove this is to multiply both sides by [A + BCD] and verify that the product is the identity. Now, in (8), identify A, B, C and D.
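The identification in the P update is presumably (a reconstruction, matching the rank-one structure of the recursion):

```latex
A = P_n^{-1}, \qquad
B = \phi_{n+1}, \qquad
C = 1 \ \text{(scalar)}, \qquad
D = \phi_{n+1}^T .
```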

  6. The matrix inversion lemma is very important in converting LS into RLS. To prove the above, expand the product and check that the cross terms cancel.
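Applying the lemma with the identification above, the matrix inverse disappears and only a scalar division remains; a reconstruction of the resulting update:

```latex
P_{n+1} = P_n - \frac{P_n \,\phi_{n+1}\phi_{n+1}^T\, P_n}{1 + \phi_{n+1}^T P_n \,\phi_{n+1}} .
```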

  7. The RLS equations follow. In practice, this recursive formula can be initiated by setting P to a large diagonal matrix, and by letting the initial parameter estimate be your best first guess.
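The full set of RLS equations is missing from the transcript; a reconstruction consistent with the updates derived above (the gain symbol K is an assumption):

```latex
\begin{aligned}
K_{n+1} &= \frac{P_n \,\phi_{n+1}}{1 + \phi_{n+1}^T P_n \,\phi_{n+1}} \\
\hat{\theta}_{n+1} &= \hat{\theta}_n + K_{n+1}\left( y_{n+1} - \phi_{n+1}^T \hat{\theta}_n \right) \\
P_{n+1} &= P_n - K_{n+1}\,\phi_{n+1}^T P_n
\end{aligned}
\qquad \text{with } P_0 = \alpha I,\ \alpha \text{ large}.
```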

  8. RLS with forgetting. We would like to modify the recursive least squares algorithm so that older data have less effect on the coefficient estimation. This can be done by weighting the objective function that we are trying to minimise (i.e. the squared error). The same weighting, used on an ARMAX model, biases the calculation of the P_n matrix so that more recent values are given greater prominence, where λ is chosen to be between 0 and 1. CY3A2 System identification
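The weighted objective itself was lost; the standard exponentially weighted cost (a reconstruction under the notation assumed above) is:

```latex
J_n(\theta) = \sum_{k=1}^{n} \lambda^{\,n-k} \left( y_k - \phi_k^T \theta \right)^2,
\qquad 0 < \lambda \le 1 .
```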

  9. When λ is 1, all time steps are of equal importance, but as λ becomes smaller, less emphasis is given to older values. We can use this expression to derive a recursive form of the weighted least squares estimate; the matrix inversion lemma then gives a method of calculating P_n from P_{n−1}. CY3A2 System identification
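The lost recursion is presumably the exponentially weighted normal equations; a sketch:

```latex
P_n^{-1} = \sum_{k=1}^{n} \lambda^{\,n-k}\, \phi_k \phi_k^T
         = \lambda\, P_{n-1}^{-1} + \phi_n \phi_n^T ,
\qquad\text{so by the lemma}\qquad
P_n = \frac{1}{\lambda}\left( P_{n-1}
      - \frac{P_{n-1}\,\phi_n \phi_n^T\, P_{n-1}}{\lambda + \phi_n^T P_{n-1}\,\phi_n} \right).
```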

  10. RLS algorithm with forgetting factor:
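The algorithm listing itself is missing from the transcript. A minimal NumPy sketch of RLS with a forgetting factor, consistent with the recursions above (the function name `rls_update`, λ = 0.98, and the toy system are illustrative assumptions, not from the slides):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One step of recursive least squares with forgetting factor lam.

    theta : current parameter estimate, shape (p,)
    P     : current covariance-like matrix, shape (p, p)
    phi   : new regressor vector, shape (p,)
    y     : new scalar observation
    """
    Pphi = P @ phi
    k = Pphi / (lam + phi @ Pphi)           # gain vector (lemma: inverse -> scalar division)
    theta = theta + k * (y - phi @ theta)   # correct estimate with the prediction error
    P = (P - np.outer(k, Pphi)) / lam       # update P; older data decay by 1/lam
    return theta, P

# Usage: identify y = 2*x1 - 3*x2 from noiseless data.
rng = np.random.default_rng(0)
true = np.array([2.0, -3.0])
theta = np.zeros(2)
P = 1e6 * np.eye(2)                         # large initial P: a vague prior on theta
for _ in range(200):
    phi = rng.standard_normal(2)
    theta, P = rls_update(theta, P, phi, phi @ true, lam=0.98)
```

With noiseless data the estimate converges to the true coefficients; in practice λ trades tracking speed against noise sensitivity.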
