Presentation Transcript


  1. ASEN 5070: Statistical Orbit Determination I Fall 2013 Professor Brandon A. Jones Professor George H. Born Lecture 36: Smoothing and State Accuracy Estimation

  2. Announcements • Homework 11 due on Friday • Sample solutions posted online • Lecture quiz due by 5pm on Wednesday • Final Exam Posted On Friday • Due December 16 by noon • By 11:59pm for CAETE Students • Final Project Due December 16 by noon • By 11:59pm for CAETE Students

  3. Fixed Interval Smoothing

  4. Motivation • The batch processor provides an estimate based on the full span of data, and without process noise the sequential processors produce an equivalent estimate • When including process noise, we lose this equivalence between the batch and any of the sequential processors • Is there some way to update the estimated state using information gained from future observations?

  5. Smoothing • Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations before and after the epoch. • Step 1. Process all observations using a CKF with process noise (SNC, DMC, etc.). • Step 2. Start with the last observation processed and smooth back through the observations.

  6. Notation • As presented in the book, the most common source of confusion for the smoothing algorithm is the notation • In a symbol such as $\hat{x}_k^{\,l}$, the base quantity is the value/vector/matrix being estimated, the subscript $k$ gives the time of the current estimate, and the superscript $l$ means the estimate is based on observations up to and including $t_l$
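A worked reading of the convention, showing the two special cases that matter for smoothing (a restatement assumed to be consistent with the textbook's usage):

```latex
\hat{x}_k \equiv \hat{x}_k^{\,k}
  \quad\text{(filter/CKF estimate: uses observations through } t_k\text{)},
\qquad
\hat{x}_k^{\,l},\; l > k
  \quad\text{(smoothed estimate: also uses the later observations } t_{k+1},\dots,t_l\text{)}.
```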

  7. Smoothing visualization • Process observations forward in time: • If you were to process them backward in time (given everything needed to do that):

  8. Smoothing visualization • Process observations forward in time: • If you were to process them backward in time (given everything needed to do that):

  9. Smoothing visualization • Smoothing does not actually combine the forward and backward solutions, but thinking of it that way helps conceptualize what smoothing does. • Smoothing results in a much more consistent solution over time, and it yields an optimal estimate that uses all of the observations.

  10. Smoothing • Caveats: • If you use process noise or some other way to increase the covariance, the result is that the optimal estimate at any time really only pays attention to observations nearby. • While this is good, it also means smoothing doesn’t always have a big effect. • Smoothing shouldn’t remove the white noise found on the signals. • It’s not a “cleaning” function, it’s a “use all the data for your estimate” function.

  11. Smoothing of State Estimate • First, we use the smoother gain and state update relation (see the reconstruction below) • If Q = 0, the relation reduces to the CKF mapping (slide 16)
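A reconstruction of the relations being referenced, assuming the standard fixed-interval (Rauch-Tung-Striebel type) form used in the course text; here $\bar{P}_{k+1}$ is the time-updated covariance and $\Phi(t_{k+1},t_k)$ the state transition matrix:

```latex
S_k = P_k\,\Phi^T(t_{k+1},t_k)\,\bar{P}_{k+1}^{-1},
\qquad
\bar{P}_{k+1} = \Phi(t_{k+1},t_k)\,P_k\,\Phi^T(t_{k+1},t_k)
              + \Gamma(t_{k+1},t_k)\,Q_k\,\Gamma^T(t_{k+1},t_k),
\\[4pt]
\hat{x}_k^{\,l} = \hat{x}_k + S_k\left(\hat{x}_{k+1}^{\,l} - \Phi(t_{k+1},t_k)\,\hat{x}_k\right),
\qquad k = l-1,\,l-2,\,\dots
```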

  12. Smoothing of State Estimate • Hence, in the CKF, we store:
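Based on the gain and update relations above, the per-step quantities the forward CKF pass would need to save are presumably

```latex
\hat{x}_k,\qquad P_k,\qquad \Phi(t_{k+1},t_k),\qquad \bar{P}_{k+1},
```

i.e., the filtered state and covariance, the state transition matrix over each interval, and the time-updated covariance, which already includes the process noise contribution.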

  13. Smoothing of Covariance • Optionally, we may smooth the state error covariance matrix
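The corresponding covariance recursion, again assuming the standard fixed-interval form and the same gain $S_k$:

```latex
P_k^{\,l} = P_k + S_k\left(P_{k+1}^{\,l} - \bar{P}_{k+1}\right)S_k^T
```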

  14. Smoothing Algorithm

  15. Smoothing Algorithm
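A minimal Python sketch of the backward (smoothing) pass, assuming the relations reconstructed above; the function name and storage layout are illustrative assumptions, not from the lecture:

```python
import numpy as np

def smooth_backward(x_hat, P, Phi, P_bar):
    """Fixed-interval smoother backward pass (sketch).

    Arguments are lists of per-step quantities saved during the forward
    CKF pass, indexed k = 0, ..., N-1 over the N observation times:
      x_hat[k] : filtered state estimate at t_k                     (n,)
      P[k]     : filtered covariance at t_k                         (n, n)
      Phi[k]   : state transition matrix Phi(t_{k+1}, t_k)          (n, n)
      P_bar[k] : time-updated covariance at t_{k+1}, including the
                 process noise (only k = 0, ..., N-2 are used)      (n, n)
    Returns smoothed states and covariances based on all N observations.
    """
    N = len(x_hat)
    x_s, P_s = [None] * N, [None] * N
    # The smoothed solution at the final time is just the filtered one.
    x_s[N - 1], P_s[N - 1] = x_hat[N - 1], P[N - 1]
    # Sweep backward through the observation times.
    for k in range(N - 2, -1, -1):
        # Smoother gain: S_k = P_k Phi^T(t_{k+1}, t_k) P_bar_{k+1}^{-1}
        S = P[k] @ Phi[k].T @ np.linalg.inv(P_bar[k])
        # Predicted state at t_{k+1} from the filtered estimate at t_k
        x_pred = Phi[k] @ x_hat[k]
        # Smoothed state and (optional) smoothed covariance
        x_s[k] = x_hat[k] + S @ (x_s[k + 1] - x_pred)
        P_s[k] = P[k] + S @ (P_s[k + 1] - P_bar[k]) @ S.T
    return x_s, P_s
```

Step 1 of slide 5 corresponds to running the CKF forward and appending these four quantities at every measurement time; Step 2 corresponds to a single call to this function.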

  16. Smoothing • If we suppose that there is no process noise (Q=0), then the smoothing algorithm reduces to the CKF mapping relationships:
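Sketching that reduction with the reconstructed relations (setting $\Gamma Q_k \Gamma^T = 0$):

```latex
\bar{P}_{k+1} = \Phi P_k \Phi^T
\;\Rightarrow\;
S_k = P_k\Phi^T\left(\Phi P_k \Phi^T\right)^{-1} = \Phi^{-1}(t_{k+1},t_k) = \Phi(t_k,t_{k+1}),
\\[4pt]
\hat{x}_k^{\,l} = \Phi(t_k,t_{k+1})\,\hat{x}_{k+1}^{\,l},
\qquad
P_k^{\,l} = \Phi(t_k,t_{k+1})\,P_{k+1}^{\,l}\,\Phi^T(t_k,t_{k+1}),
```

i.e., without process noise the smoothed quantities are simply the filter results mapped backward with the state transition matrix.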

  17. An example: 4-41 and 4-42 Book p. 283

  18. An example: 4-41 and 4-42 Book p. 284

  19. Smoothing • Say there are 100 observations • We want to construct new estimates using all data, i.e., the smoothed estimates $\hat{x}_k^{\,100}$ for k = 1, …, 100

  20. Smoothing • Say there are 100 observations

  21. Smoothing • Say there are 100 observations

  22. Smoothing • Say there are 100 observations
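Written out with the reconstructed recursion, the backward sweep these slides are presumably illustrating starts at the last observation and works toward the first (indexing is an assumption consistent with the algorithm above):

```latex
\hat{x}_{100}^{\,100} = \hat{x}_{100}
\;\longrightarrow\;
\hat{x}_{99}^{\,100} = \hat{x}_{99} + S_{99}\!\left(\hat{x}_{100}^{\,100} - \Phi(t_{100},t_{99})\,\hat{x}_{99}\right)
\;\longrightarrow\;
\hat{x}_{98}^{\,100}
\;\longrightarrow\;\cdots\;\longrightarrow\;
\hat{x}_{1}^{\,100}
```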

  23. Solution Characterization

  24. Factors Influencing Filter Accuracy • Truncation error (linearization) • Round-off error (fixed precision arithmetic) • Mathematical model simplifications (dynamics and measurement model) • Errors in input parameters (e.g., J2) • Amount, type, and accuracy of tracking data

  25. How do we characterize our accuracy? • For the Jason-2 / OSTM mission, the OD fits are quoted to have errors of less than a centimeter in the radial direction • How do they get an approximation of that accuracy? • Residuals? • Depends on how much we trust the data • Provides information on the fit to the data, but what about solution accuracy? • Covariance matrix? • How realistic is the output covariance matrix? • (Actually, I can make the output matrix whatever I want through process noise or other means.)

  26. Preliminary Discussion – Batch Processor Covariance • Qualitatively, how does the mapped covariance look for the Batch processor?

  27. Solution Characterization • Characterization requires a comparison to an independent solution • Different solution methods, models, etc. • Different observation data sets: • Global Navigation Satellite Systems (GNSS) (e.g., GPS) • Doppler Orbitography and Radio-positioning Integrated by Satellite (DORIS) • Satellite Laser Ranging (SLR) • Deep Space Network (DSN) • Delta-DOR • Others…

  28. Compare to Independent Solution • Jason-2 / OSTM position solutions generated by/at: • JPL – GPS only • GSFC – SLR, DORIS, and GPS • CNES – SLR, DORIS, and GPS • Algorithms/tools differ by team: • Different filters • Different dynamic/stochastic models

  29. Comparison of Jason-2 / OSTM Solutions Image: Bertiger et al., 2010 • 1 cycle = approximately 10 days • Differences on the order of millimeters

  30. Orbit Overlap Studies • Compare different fit intervals:

  31. Orbit Overlap Studies • Consider the “abutment test”:

  32. Example: Jason-2 / OSTM • Each data fit at JPL uses 30 hrs of data, centered at noon • This means that each data fit overlaps with the previous/next fit by six hours • Compare the solutions over the middle four hours • Why?
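A small Python sketch of the arithmetic behind the overlap (the dates are hypothetical; the one-hour trim on each side is an assumption that reproduces the "middle four hours," presumably to stay away from the less well determined ends of each arc):

```python
from datetime import datetime, timedelta

def arc_window(day):
    """30-hour data fit centered at noon on the given (hypothetical) day."""
    center = datetime(2013, 11, day, 12, 0, 0)
    half = timedelta(hours=15)          # 30 h arc -> +/- 15 h about the center
    return center - half, center + half

start1, end1 = arc_window(1)            # Oct 31 21:00 -> Nov 02 03:00
start2, end2 = arc_window(2)            # Nov 01 21:00 -> Nov 03 03:00

# Consecutive arcs are centered 24 h apart, so they share 30 - 24 = 6 hours.
overlap_start = max(start1, start2)
overlap_end = min(end1, end2)
print(overlap_end - overlap_start)      # 6:00:00

# Middle four hours of the overlap, used for the comparison
trim = timedelta(hours=1)
print(overlap_start + trim, overlap_end - trim)   # Nov 01 22:00 -> Nov 02 02:00
```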

  33. Example: Jason-2 / OSTM Image: Bertiger et al., 2010 • Histogram of daily overlaps for almost one year • The overlaps imply a solution consistency of ~1.7 mm • This is an example of why it is called “precise orbit determination” rather than “accurate orbit determination”
