
ASEN 5070 Statistical Orbit Determination I Fall 2012 Professor Jeffrey S. Parker


Presentation Transcript


  1. ASEN 5070 Statistical Orbit Determination I Fall 2012 Professor Jeffrey S. Parker Professor George H. Born Lecture 19: Numerical Compensations

  2. Announcements • Homework 8 due next week. • Make sure you spend time studying for the exam. • Exam 2 in one week (Thursday). • Review on Tuesday. • Exam 2 will cover: • Batch vs. CKF vs. EKF • Probability and statistics (good to keep this up!) • Haven’t settled on a question yet, but it will probably be a conditional probability question, i.e., what’s the probability of X given that Y occurs? • Observability • Numerical compensation techniques, such as the Joseph and Potter formulations. • No calculators should be necessary. • Open Book, Open Notes

  3. Quiz 15 Review

  4. Quiz 15 Review Use MATLAB and try it out.

  5. Quiz 15 Review This is TRUE for the batch filter, but you may run into a problem with the Kalman filters.

  6. Quiz 15 Review The best scan of the book ever:

  7. Quiz 15 Review

  8. Quiz 15 Review

  9. Quiz 15 Review

  10. Quiz 15 Review

  11. Quiz 15 Review

  12. Quiz 15 Review

  13. HW#8 • Due in 7 days

  14. HW#8

  15. HW#8 Solutions to Problem (2). Note that these plots aren’t 100% well-labeled!

  16. HW#8

  17. HW#8 • The biggest pitfall • When processing the 2nd observation, set the a priori covariance equal to the covariance propagated forward from the first measurement update; that is, use the most current covariance you have as the a priori, not the original one (see the sketch below).
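
A minimal sketch of that covariance flow, assuming a simple CKF loop with state transition matrices Phi and no process noise (the function and variable names here are illustrative, not the homework's):

```python
import numpy as np

def ckf_covariance_flow(P0_bar, Phi_list, H_list, R_list):
    """Illustrative CKF covariance recursion: each measurement update starts
    from the covariance propagated forward from the previous update, not
    from the original a priori P0_bar."""
    P = P0_bar
    for Phi, H, R in zip(Phi_list, H_list, R_list):
        P_bar = Phi @ P @ Phi.T                      # time update of the *latest* P
        K = P_bar @ H.T @ np.linalg.inv(H @ P_bar @ H.T + R)
        P = (np.eye(P.shape[0]) - K @ H) @ P_bar     # measurement update
    return P
```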

  18. Previous Lecture • Processing an observation vector one element at a time. • Whitening • Cholesky • Joseph • Today • Positive Definiteness • Conditioning Number • Potter • Householder

  19. Positive Definite Matrices • Definition
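
For reference, the standard definition: a symmetric matrix A is positive definite if x^T A x > 0 for every nonzero vector x (and positive semidefinite if x^T A x >= 0 for all x).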

  20. Positive Definite Matrices • Properties of PD matrices

  21. Positive Definite Matrices • Properties of PD matrices
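
For reference, standard properties of a symmetric positive definite matrix P include: all of its eigenvalues are positive; it is nonsingular and its inverse is also positive definite; its diagonal entries and determinant are positive; and it admits a Cholesky factorization P = L L^T with L lower triangular and nonsingular.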

  22. Quick Break

  23. Example Illustrating Numerical Instability of the Sequential (Kalman) Filter (see 4.7.1). Summary of Results: Conventional Kalman, Joseph, and Batch formulations, each compared against the exact solution (accurate to the order shown on the slide).
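
A small constructed illustration of the kind of breakdown in the 4.7.1 example (this toy case is not the textbook's numbers): with a very precise, poorly scaled observation, the conventional update P = (I - KH)Pbar can lose positive definiteness in finite precision, while the Joseph form P = (I - KH)Pbar(I - KH)^T + K R K^T preserves it.

```python
import numpy as np

# Toy illustration (not the textbook's 4.7.1 numbers): conventional vs. Joseph
# covariance measurement updates with a very precise, poorly scaled observation.
eps = 1e-9
P_bar = np.eye(2)                 # a priori covariance
H = np.array([[1.0, eps]])        # 1 x 2 observation-state mapping
R = np.array([[eps**2]])          # tiny observation noise variance

K = P_bar @ H.T @ np.linalg.inv(H @ P_bar @ H.T + R)
I = np.eye(2)

P_conv = (I - K @ H) @ P_bar                                   # conventional form
P_jos = (I - K @ H) @ P_bar @ (I - K @ H).T + K @ R @ K.T      # Joseph form

# The exact updated (0,0) variance is 2*eps**2 / (1 + 2*eps**2), about 2e-18.
print("conventional P[0,0]:", P_conv[0, 0], " det:", np.linalg.det(P_conv))
print("Joseph       P[0,0]:", P_jos[0, 0], " det:", np.linalg.det(P_jos))
# In double precision the conventional result returns a zero variance and a
# negative determinant (no longer positive definite); the Joseph result keeps
# the ~2e-18 variance and a positive determinant.
```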

  24. Conditioning Number • The conditioning number of a matrix A is C(A) = γ_max / γ_min, where the γ's are the largest and smallest eigenvalues of the matrix. • Inverting A with p digits of precision becomes error-prone as C(A) → 10^p. • If we instead work with a square root W of A (A = W W^T), then C(W) = sqrt(C(A)), and numerical difficulties do not arise until C(W) → 10^p, i.e., C(A) → 10^2p.
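
A quick numerical check of the C(W) = sqrt(C(A)) relationship, using a deliberately ill-conditioned covariance constructed for illustration:

```python
import numpy as np

# Illustrative check: the 2-norm condition number of a Cholesky square root W
# is the square root of the condition number of the matrix it factors.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
P = Q @ np.diag([1.0, 1e-3, 1e-6, 1e-12]) @ Q.T   # widely spread eigenvalues
P = 0.5 * (P + P.T)                               # enforce exact symmetry

W = np.linalg.cholesky(P)                         # one valid square root: P = W W^T

print("C(P)      :", np.linalg.cond(P))           # roughly 1e12
print("C(W)      :", np.linalg.cond(W))           # roughly 1e6
print("sqrt(C(P)):", np.sqrt(np.linalg.cond(P)))
```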

  25. Square Root Filter Algorithms

  26. Square Root Filter Algorithms • Motivation: • Loss of significant digits can occur in computing the measurement update of the state error covariance matrix (P) at the observation epoch (Kaminski et al., 1971). • If the eigenvalues span a wide range, then numerical errors can destroy the symmetry and positive definiteness of the P matrix, and filter divergence can occur.

  27. Square Root Filter Algorithms • Define W, the square root of P, by P = W W^T. • Observe that if we have W, then computing P in this manner will always result in a symmetric PD matrix. • Note: Square root filters are typically derived to process one observation at a time. Hence, each observation is treated as a scalar: H is a 1×n row vector and R is a scalar variance.
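
The reason this reconstruction is safe: for any real matrix W and any vector x, x^T (W W^T) x = (W^T x)^T (W^T x) = ||W^T x||^2 >= 0, and (W W^T)^T = W W^T, so a covariance rebuilt as P = W W^T is automatically symmetric and at least positive semidefinite (positive definite whenever W is nonsingular), regardless of rounding errors accumulated in W.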

  28. (Sorry for just scanning the text, but it’s a pretty concise description!)

  29. (Sorry for just scanning the text, but it’s a pretty concise description!)

  30. This is a key

  31. Potter Square Root Filter

  32. Potter Square Root Filter

  33. Potter Square Root Filter

  34. Potter Square Root Filter

  35. Potter Square Root Filter

  36. Potter Square Root Filter

  37. Potter Square Root Filter

  38. Potter Square Root Filter (say, using Cholesky)

  39. Potter Square Root Filter

  40. Potter Square Root Filter

  41. Potter Square Root Filter

  42. Potter Square Root Filter

  43. Potter Square Root Filter

  44. Potter Square Root Filter

  45. Potter Square Root Filter Note: if you are given an a priori P matrix, convert it to W using Cholesky or equivalent at start.

  46. Potter Square Root Filter

  47. Potter Square Root Filter See also page 339 of the Stat OD text.
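
A compact sketch of the Potter measurement update for a single scalar observation, following the standard Potter formulation (cf. the page 339 reference above); the function and variable names are illustrative:

```python
import numpy as np

def potter_update(x_bar, W_bar, y, H, R):
    """One Potter square-root measurement update for a scalar observation y,
    with 1 x n mapping H and scalar noise variance R. W_bar is a square root
    of the a priori covariance (P_bar = W_bar @ W_bar.T); the function returns
    the updated state x_hat and the updated square root W (so P = W @ W.T)."""
    H = np.atleast_2d(H)
    F = W_bar.T @ H.T                          # n x 1
    alpha = 1.0 / ((F.T @ F).item() + R)       # scalar: 1 / (F^T F + R)
    gamma = 1.0 / (1.0 + np.sqrt(R * alpha))
    K = alpha * (W_bar @ F)                    # n x 1 Kalman gain
    x_hat = x_bar + K @ (np.atleast_1d(y) - H @ x_bar)
    W = W_bar - gamma * (K @ F.T)              # updated square root of P
    return x_hat, W

# If only an a priori covariance matrix P0 is available, initialize the square
# root with a Cholesky factor, as noted on the earlier slide:
# W0 = np.linalg.cholesky(P0)
```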

  48. Numerical Instability of Kalman Filter. Summary of Results: Potter algorithm, Conventional Kalman, Joseph, and Batch formulations, each compared against the exact solution (accurate to the order shown on the slide).
