ASEN 5070: Statistical Orbit Determination I

Fall 2013

Professor Brandon A. Jones

Professor George H. Born

Lecture 36: Smoothing and State Accuracy Estimation


Announcements

  • Homework 11 due on Friday

    • Sample solutions posted online

  • Lecture quiz due by 5pm on Wednesday

  • Final exam posted on Friday

    • Due December 16 by noon

      • By 11:59pm for CAETE Students

  • Final Project Due December 16 by noon

    • By 11:59pm for CAETE Students



Motivation

  • The batch processor provides an estimate based on the full span of data; without process noise, the batch estimate and the sequential (CKF) estimate are equivalent once all observations have been processed

  • When we include process noise, we lose this equivalence between the batch and any of the sequential processors

  • Is there some way to update the estimated state using information gained from future observations?


Smoothing

  • Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations before and after the epoch.

  • Step 1. Process all observations using a CKF with process noise (SNC, DMC, etc.).

  • Step 2. Start with the last observation processed and smooth back through the observations.


Notation

  • As presented in the book, the most common source of confusion for the smoothing algorithm is the notation

In the book's notation, a smoothed quantity such as $\hat{x}_k^{\ell}$ carries two time indices: the subscript $k$ gives the time of the current estimate, and the superscript $\ell$ indicates that the value/vector/matrix is based on observations up to and including time $t_\ell$.
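For instance (a small worked example; the index values are purely illustrative): $\hat{x}_{50}^{50}$ is the filter estimate at $t_{50}$ using observations 1 through 50, i.e., the CKF output at that time, while $\hat{x}_{50}^{100}$ is the smoothed estimate at the same time based on all 100 observations.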


Smoothing visualization

  • Process observations forward in time: each estimate at $t_k$ is based only on observations up to and including $t_k$

  • If you were to process them backward in time (given everything needed to do that), each estimate at $t_k$ would instead be based only on the observations after $t_k$



Smoothing visualization

  • Smoothing does not actually combine a forward and a backward solution, but thinking of it that way helps conceptualize what smoothing does.

  • Smoothing results in a much more consistent solution over time, and it yields an optimal estimate that uses all of the observations.


Smoothing

  • Caveats:

    • If you use process noise or some other way to increase the covariance, the result is that the optimal estimate at any time really only pays attention to observations nearby.

    • While this is good, it also means smoothing doesn’t always have a big effect.

  • Smoothing shouldn’t remove the white noise found on the signals.

    • It’s not a “cleaning” function, it’s a “use all the data for your estimate” function.


Smoothing of State Estimate

  • First, we use the backward recursion for the smoothed state estimate, sketched below

  • If Q = 0, the smoother reduces to the CKF mapping relationships (the final estimate mapped back through the state transition matrix)
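A minimal sketch of that recursion, written in the standard Rauch-Tung-Striebel form (the symbols below follow the usual time-update definitions; the exact notation is an assumption, not copied from the slides):

    S_k = P_k \Phi^T(t_{k+1}, t_k) \bar{P}_{k+1}^{-1}

    \hat{x}_k^{\ell} = \hat{x}_k + S_k ( \hat{x}_{k+1}^{\ell} - \bar{x}_{k+1} )

Here $\bar{x}_{k+1} = \Phi(t_{k+1}, t_k) \hat{x}_k$ and $\bar{P}_{k+1} = \Phi(t_{k+1}, t_k) P_k \Phi^T(t_{k+1}, t_k) + \Gamma(t_{k+1}, t_k) Q_k \Gamma^T(t_{k+1}, t_k)$ are the time-updated (a priori) quantities from the forward CKF pass, and the recursion starts from $\hat{x}_\ell^{\ell} = \hat{x}_\ell$.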


Smoothing of State Estimate

  • Hence, during the forward CKF pass we store, at each observation time, the quantities that the backward recursion needs
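Assuming the RTS-style recursion above, the stored quantities would be the filtered estimate and covariance, the state transition matrix to the next time, and the time-updated (a priori) estimate and covariance. A minimal sketch of the store-then-smooth structure (the function name and the use of NumPy are illustrative assumptions, not the course's reference implementation):

    import numpy as np

    def smooth(x_hat, P, x_bar_next, P_bar_next, Phi_next):
        # x_hat[k], P[k]               : filtered estimate / covariance at t_k, k = 0..N-1
        # x_bar_next[k], P_bar_next[k] : time-updated (a priori) estimate / covariance at t_{k+1}
        # Phi_next[k]                  : state transition matrix Phi(t_{k+1}, t_k)
        N = len(x_hat)
        x_s = [None] * N
        P_s = [None] * N
        x_s[-1] = x_hat[-1]   # the last filter estimate is already based on all observations
        P_s[-1] = P[-1]
        for k in range(N - 2, -1, -1):
            # smoothing gain S_k = P_k Phi^T(t_{k+1}, t_k) P_bar_{k+1}^{-1}
            S = P[k] @ Phi_next[k].T @ np.linalg.inv(P_bar_next[k])
            # correct the filtered estimate using information from the future
            x_s[k] = x_hat[k] + S @ (x_s[k + 1] - x_bar_next[k])
            # optionally smooth the covariance as well
            P_s[k] = P[k] + S @ (P_s[k + 1] - P_bar_next[k]) @ S.T
        return x_s, P_s

Everything inside the loop comes from arrays filled during the forward pass; no observation is reprocessed during smoothing.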


Smoothing of Covariance

  • Optionally, we may smooth the state error covariance matrix
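Under the same assumed notation, the covariance recursion would read:

    P_k^{\ell} = P_k + S_k ( P_{k+1}^{\ell} - \bar{P}_{k+1} ) S_k^T

with the recursion starting from $P_\ell^{\ell} = P_\ell$.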




Smoothing

  • If we suppose that there is no process noise (Q=0), then the smoothing algorithm reduces to the CKF mapping relationships:
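With $Q = 0$ the a priori covariance is $\bar{P}_{k+1} = \Phi(t_{k+1}, t_k) P_k \Phi^T(t_{k+1}, t_k)$, so the assumed gain collapses to the inverse mapping and the smoothed quantities are simply the later solution mapped backward:

    S_k = \Phi(t_k, t_{k+1})

    \hat{x}_k^{\ell} = \Phi(t_k, t_{k+1}) \hat{x}_{k+1}^{\ell}

    P_k^{\ell} = \Phi(t_k, t_{k+1}) P_{k+1}^{\ell} \Phi^T(t_k, t_{k+1})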




Smoothing

  • Say there are 100 observations

  • We want to construct new estimates using all of the data, i.e., the smoothed estimates $\hat{x}_k^{100}$ for every time $t_k$
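Written out with the assumed notation, the backward pass proceeds as:

    \hat{x}_{100}^{100} = \hat{x}_{100}    (the last filter estimate needs no correction)

    \hat{x}_{99}^{100} = \hat{x}_{99} + S_{99} ( \hat{x}_{100}^{100} - \bar{x}_{100} )

    \hat{x}_{98}^{100} = \hat{x}_{98} + S_{98} ( \hat{x}_{99}^{100} - \bar{x}_{99} )

    ...

    \hat{x}_{1}^{100} = \hat{x}_{1} + S_{1} ( \hat{x}_{2}^{100} - \bar{x}_{2} )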




Factors Influencing Filter Accuracy

  • Truncation error (linearization)

  • Round-off error (fixed precision arithmetic)

  • Mathematical model simplifications (dynamics and measurement model)

  • Errors in input parameters (e.g., J2)

  • Amount, type, and accuracy of tracking data


How do we characterize our accuracy?

  • For the Jason-2 / OSTM mission, the OD fits are quoted to have errors of less than a centimeter in the radial direction

    • How do they arrive at such an accuracy estimate?

    • Residuals?

      • Depends on how much we trust the data

      • Residuals tell us about the fit to the data, but what about solution accuracy?

    • Covariance Matrix?

      • How realistic is the output covariance matrix?

      • (Actually, I can make the output matrix whatever I want through process noise or other means.)


Preliminary Discussion – Batch Processor Covariance

  • Qualitatively, how does the mapped covariance look for the Batch processor?
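For reference, a batch fit contains no process noise, so its covariance is simply mapped through the state transition matrix; a one-line sketch, with $P_0$ the estimated covariance at the fit epoch and symbols otherwise as usually defined in this course:

    P(t_k) = \Phi(t_k, t_0) P_0 \Phi^T(t_k, t_0)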


Solution Characterization

  • Characterization requires a comparison to an independent solution

    • Different solution methods, models, etc.

    • Different observation data sets:

      • Global Navigation Satellite Systems (GNSS) (e.g., GPS)

      • Doppler Orbitography and Radio-positioning Integrated by Satellite (DORIS)

      • Satellite Laser Ranging (SLR)

      • Deep Space Network (DSN)

      • Delta-DOR

      • Others…


Compare to Independent Solution

  • Jason-2 / OSTM position solutions generated by/at:

    • JPL – GPS only

    • GSFC – SLR, DORIS, and GPS

    • CNES – SLR, DORIS, and GPS

  • Algorithms/tools differ by team:

    • Different filters

    • Different dynamic/stochastic models


Comparison of Jason-2 / OSTM Solutions

Image: Bertiger, et al., 2010

  • 1 Cycle = approximately 10 days

  • Differences on the order of millimeters


Orbit Overlap Studies

  • Compare different fit intervals:


Orbit Overlap Studies

  • Consider the “abutment test”:


Example: Jason-2 / OSTM

  • Each data fit at JPL uses 30 hrs of data, centered at noon

  • This means that each data fit overlaps with the previous/next fit by six hours

  • Compare the solutions over the middle four hours

    • Why?
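Presumably the ends of a fit arc are the least well-constrained, so the comparison avoids the edges of the overlap. A minimal sketch of such an overlap comparison, assuming both fits have already been interpolated to a common set of epochs in the shared window (the function name, the one-hour edge trim, and the 3D RMS metric are illustrative assumptions):

    import numpy as np

    def overlap_rms(t, pos_fit_a, pos_fit_b, trim_hours=1.0):
        # t          : epochs (seconds) in the overlap, shared by both fits
        # pos_fit_a  : (N, 3) positions from the earlier fit at epochs t
        # pos_fit_b  : (N, 3) positions from the later fit at the same epochs
        # trim_hours : hours discarded at each end of the overlap (edge effects)
        trim = trim_hours * 3600.0
        keep = (t >= t[0] + trim) & (t <= t[-1] - trim)    # middle of the overlap
        diff = pos_fit_a[keep] - pos_fit_b[keep]           # per-epoch difference vectors
        return np.sqrt(np.mean(np.sum(diff**2, axis=1)))   # 3D RMS of the differences

For the Jason-2 case described above, each daily pair of fits shares a six-hour overlap, and a statistic like this (or its radial-only analogue) would be accumulated into the histogram on the next slide.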


Example: Jason-2 / OSTM

Image: Bertiger, et al., 2010

  • Histogram of daily overlaps for almost one year

  • Implies a solution consistency of ~1.7 mm

  • This is an example of why it is called "precise orbit determination" instead of "accurate orbit determination"

