ASEN 5070: Statistical Orbit Determination I

Fall 2013

Professor Brandon A. Jones

Professor George H. Born

Lecture 36: Smoothing and State Accuracy Estimation

Announcements
  • Homework 11 due on Friday
    • Sample solutions posted online
  • Lecture quiz due by 5pm on Wednesday
  • Final Exam Posted On Friday
    • Due December 16 by noon
      • By 11:59pm for CAETE Students
  • Final Project Due December 16 by noon
      • By 11:59pm for CAETE Students
Motivation
  • The batch processor provides an estimate based on the full span of data; without process noise, this estimate is equivalent to the estimate from a sequential processor mapped over the same data arc
  • When process noise is included, we lose this equivalence between the batch and the sequential processors
  • Is there some way to update the estimated state using information gained from future observations?
Smoothing
  • Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations both before and after the epoch of interest.
  • Step 1. Process all observations using a CKF with process noise (SNC, DMC, etc.).
  • Step 2. Start with the last observation processed and smooth backward through the observations (a skeletal sketch of this two-pass structure follows).
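A minimal sketch of this two-pass structure in Python. The functions `ckf_step` and `smooth_step` are hypothetical placeholders standing in for the CKF time/measurement update and the smoothing recursion given on the later slides; they are not calls to any particular library.

```python
def smooth_all(observations, x_hat0, P0):
    # Pass 1: forward CKF with process noise; at each measurement time t_k,
    # store everything the backward pass will need (see the later slides).
    history = []
    x_hat, P = x_hat0, P0
    for y in observations:
        # Hypothetical: time update to t_k, then measurement update with y.
        x_hat, P, P_bar, Phi = ckf_step(x_hat, P, y)
        history.append((x_hat, P, P_bar, Phi))

    # Pass 2: start at the last observation and smooth backward in time.
    x_s, P_s = history[-1][0], history[-1][1]
    smoothed = [(x_s, P_s)]
    for k in range(len(history) - 2, -1, -1):
        x_hat_k, P_k, _, _ = history[k]
        _, _, P_bar_next, Phi_next = history[k + 1]  # from the t_k -> t_{k+1} time update
        x_s, P_s = smooth_step(x_hat_k, P_k, P_bar_next, Phi_next, x_s, P_s)
        smoothed.append((x_s, P_s))
    return smoothed[::-1]  # reorder so index k corresponds to t_k
```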
Notation
  • As presented in the book, the most common source of confusion for the smoothing algorithm is the notation

(The slide annotates the notation with three labels: the base symbol is the value/vector/matrix being estimated, the subscript is the time of the current estimate, and the superscript indicates the estimate is based on observations up to and including that time.)
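Written out explicitly, assuming the convention of the Tapley, Schutz, and Born text that this course follows:

```latex
% Smoothing notation: subscript = epoch of the estimate,
% superscript = last observation included.
\hat{\mathbf{x}}_k^{\,l} \;=\; \text{estimate of the state at } t_k
  \text{ based on observations up to and including } t_l,
\qquad P_k^{\,l} \;=\; \text{its covariance.}
```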

Smoothing visualization
  • Process observations forward in time
  • If you were to process them backward in time (given everything needed to do that), you would get a second estimate at each time
Smoothing visualization
  • Smoothing does not actually combine the forward and backward solutions, but thinking of it that way helps conceptualize what smoothing does.
  • Smoothing results in a much more consistent solution over time, and it yields an optimal estimate that uses all of the observations.
Smoothing
  • Caveats:
    • If you use process noise or some other way to increase the covariance, the result is that the optimal estimate at any time really only pays attention to observations nearby.
    • While this is good, it also means smoothing doesn’t always have a big effect.
  • Smoothing shouldn’t remove the white noise found on the signals.
    • It’s not a “cleaning” function, it’s a “use all the data for your estimate” function.
Smoothing of State Estimate
  • First, we use the smoothed-state recursion given below.
  • If Q = 0, the smoother gain reduces to the inverse state transition matrix, and smoothing reduces to mapping the estimate.
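The slide's equations are not reproduced in this transcript. The standard sequential-smoother relations (as given in Tapley, Schutz, and Born), which these bullets appear to reference, are:

```latex
% Smoothed state recursion (standard form; assumed to match the missing slide equations)
\hat{\mathbf{x}}_k^{\,l} = \hat{\mathbf{x}}_k^{\,k}
   + S_k\left(\hat{\mathbf{x}}_{k+1}^{\,l} - \bar{\mathbf{x}}_{k+1}\right),
\qquad
S_k = P_k^{\,k}\,\Phi^T(t_{k+1},t_k)\,\bar{P}_{k+1}^{-1},
% where the time-updated (predicted) quantities are
\bar{\mathbf{x}}_{k+1} = \Phi(t_{k+1},t_k)\,\hat{\mathbf{x}}_k^{\,k},
\qquad
\bar{P}_{k+1} = \Phi(t_{k+1},t_k)\,P_k^{\,k}\,\Phi^T(t_{k+1},t_k)
   + \Gamma(t_{k+1},t_k)\,Q_k\,\Gamma^T(t_{k+1},t_k).
```

With $Q_k = 0$, $\bar{P}_{k+1} = \Phi P_k^k \Phi^T$ and the gain collapses to $S_k = \Phi^{-1}(t_{k+1},t_k) = \Phi(t_k,t_{k+1})$, so the smoothed estimate is simply the filter estimate mapped back from $t_{k+1}$.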
Smoothing of State Estimate
  • Hence, during the forward CKF pass, we store:
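The stored quantities are not listed in this transcript; from the recursion above, the backward pass needs, at each measurement time:

```latex
% Stored at each t_k during the forward CKF pass (inferred from the recursion above)
\hat{\mathbf{x}}_k^{\,k}, \qquad P_k^{\,k}, \qquad
\Phi(t_{k+1}, t_k), \qquad \bar{P}_{k+1}
```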
Smoothing of Covariance
  • Optionally, we may smooth the state error covariance matrix
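The covariance equation is likewise missing from the transcript; the standard smoothed-covariance recursion, presumably what the slide shows, is:

```latex
% Smoothed covariance recursion (standard form; same gain S_k as above)
P_k^{\,l} = P_k^{\,k} + S_k\left(P_{k+1}^{\,l} - \bar{P}_{k+1}\right)S_k^T
```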
Smoothing
  • If we suppose that there is no process noise (Q=0), then the smoothing algorithm reduces to the CKF mapping relationships:
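The mapping relationships themselves are not reproduced in the transcript; presumably they are:

```latex
% With Q = 0, smoothing is equivalent to mapping the final filter solution backward
\hat{\mathbf{x}}_k^{\,l} = \Phi(t_k, t_l)\,\hat{\mathbf{x}}_l^{\,l},
\qquad
P_k^{\,l} = \Phi(t_k, t_l)\,P_l^{\,l}\,\Phi^T(t_k, t_l)
```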
Smoothing
  • Say there are 100 observations
  • We want to construct new estimates that use all of the data, i.e., the smoothed estimate at every time t_k based on all 100 observations (a numerical sketch of this backward pass follows)
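A numerical sketch of that backward pass in Python/NumPy, assuming the forward CKF has already stored the four quantities listed earlier at each of the 100 measurement times (the variable names are illustrative, not from any particular toolbox):

```python
import numpy as np

def backward_smoothing_pass(x_hat, P, Phi, P_bar):
    """Smooth stored CKF results backward through all N observation times.

    x_hat[k], P[k]   : filter estimate and covariance at t_k (k = 0..N-1)
    Phi[k], P_bar[k] : Phi(t_k, t_{k-1}) and the time-updated covariance at t_k
                       (used when smoothing from t_k back to t_{k-1})
    Returns smoothed estimates and covariances based on all N observations.
    """
    N = len(x_hat)
    x_s, P_s = [None] * N, [None] * N
    x_s[-1], P_s[-1] = x_hat[-1], P[-1]      # smoothed = filtered at the final time
    for k in range(N - 2, -1, -1):
        # Smoother gain built from the t_k -> t_{k+1} time update
        S = P[k] @ Phi[k + 1].T @ np.linalg.inv(P_bar[k + 1])
        x_bar_next = Phi[k + 1] @ x_hat[k]   # predicted state at t_{k+1}
        x_s[k] = x_hat[k] + S @ (x_s[k + 1] - x_bar_next)
        P_s[k] = P[k] + S @ (P_s[k + 1] - P_bar[k + 1]) @ S.T
    return x_s, P_s
```

With Q = 0 in the forward filter, this pass simply reproduces the mapped CKF solution, as noted on the previous slide.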
Factors Influencing Filter Accuracy
  • Truncation error (linearization)
  • Round-off error (fixed precision arithmetic)
  • Mathematical model simplifications (dynamics and measurement model)
  • Errors in input parameters (e.g., J2)
  • Amount, type, and accuracy of tracking data
How do we characterize our accuracy?
  • For the Jason-2 / OSTM mission, the OD fits are quoted to have errors of less than a centimeter (in the radial direction)
    • How do they arrive at such an accuracy estimate?
    • Residuals?
      • Their value depends on how much we trust the data
      • They provide information on the fit to the data, but what about solution accuracy?
    • Covariance Matrix?
      • How realistic is the output covariance matrix?
      • (Actually, I can make the output matrix whatever I want through process noise or other means.)
Preliminary Discussion – Batch Processor Covariance
  • Qualitatively, how does the mapped covariance look for the Batch processor?
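For context (a sketch, not the slide's answer): in the batch processor the covariance is computed once at the epoch and then mapped along the arc with the state transition matrix alone, so no process-noise term inflates it:

```latex
% Batch epoch covariance and its mapping along the arc (no process-noise term)
P_0 = \left(H^T R^{-1} H + \bar{P}_0^{-1}\right)^{-1},
\qquad
P(t_k) = \Phi(t_k, t_0)\,P_0\,\Phi^T(t_k, t_0)
```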
Solution Characterization
  • Characterization requires a comparison to an independent solution
    • Different solution methods, models, etc.
    • Different observation data sets:
      • Global Navigation Satellite Systems (GNSS) (e.g., GPS)
      • Doppler Orbitography and Radio-positioning Integrated by Satellite (DORIS)
      • Satellite Laser Ranging (SLR)
      • Deep Space Network (DSN)
      • Delta-DOR
      • Others…
Compare to Independent Solution
  • Jason-2 / OSTM position solutions are generated at:
    • JPL – GPS only
    • GSFC – SLR, DORIS, and GPS
    • CNES – SLR, DORIS, and GPS
  • Algorithms/tools differ by team:
    • Different filters
    • Different dynamic/stochastic models
Comparison of Jason-2 / OSTM Solutions

Image: Bertiger et al., 2010

  • 1 Cycle = approximately 10 days
  • Differences on the order of millimeters
Orbit Overlap Studies
  • Compare solutions obtained from different fit intervals
Orbit Overlap Studies
  • Consider the “abutment test”: compare consecutive fit arcs at the point where they abut
Example: Jason-2 / OSTM
  • Each data fit at JPL uses 30 hrs of data, centered at noon
  • This means that each data fit overlaps with the previous/next fit by six hours
  • Compare the solutions over the middle four hours of the overlap (a sketch of this comparison follows)
    • Why only the middle?
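A sketch of such an overlap comparison, assuming both fits have been sampled onto a common time grid (the sampling, units, and RMS metric here are illustrative assumptions, not the JPL procedure):

```python
import numpy as np

def overlap_rms(t_hours, pos_a, pos_b, overlap_start, overlap_end, trim_hours=1.0):
    """RMS of 3-D position differences over the central part of an overlap.

    t_hours      : times common to both fits [hours]
    pos_a, pos_b : (N, 3) positions [m] from two consecutive 30-hour fits
    The first and last `trim_hours` of the 6-hour overlap are discarded, since
    the ends of each fit arc are typically the least well constrained.
    """
    lo, hi = overlap_start + trim_hours, overlap_end - trim_hours
    mask = (t_hours >= lo) & (t_hours <= hi)          # central ~4 hours
    diff = pos_a[mask] - pos_b[mask]
    return np.sqrt(np.mean(np.sum(diff**2, axis=1)))
```

Differences computed this way measure the consistency (precision) of the solutions rather than their absolute accuracy, which is the point made on the next slide.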
Example: Jason-2 / OSTM

Image: Bertiger et al., 2010

  • Histogram of daily overlaps for almost one year
  • Implies a solution consistency of ~1.7 mm
  • This is an example of why it is called “precise orbit determination” rather than “accurate orbit determination”