Ionospheric Integrity Lessons from the WIPP

  1. Ionospheric Integrity Lessons from the WIPP
  Todd Walter, Stanford University
  http://waas.stanford.edu

  2. History
  • Ionospheric Storms and Disturbances Originally Tested Via Scenarios
    • Simulated disturbances added to simulated ionosphere
    • Generally, large geographic features were placed near center of network
  • Ionospheric Algorithm Originally Based on JPL GIM Code
    • Tuned to work on scenarios
    • Live data from WRSs not yet available

  3. Ionospheric Models
  • Provide Truth
    • Good for initial algorithm validation
  • Very Smooth
    • Average TEC
    • Loses small-scale variations
    • Spatial gradients smoothed as well
  • Useful Tool Before Data Was Available
  • However, Does Not Faithfully Represent the Real-World Instantaneous Ionosphere

  4. Example Scenario
  • From “Ionospheric Specification for the Wide Area Augmentation System (WAAS) Simulation Studies” by Steve Chavin, ION GPS-96
  • dTEC/dt = 0.74 TECU/min
  • Gradient = 0.085 TECU/km
  • Shell height = 360 km
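  To put the scenario's rates into ranging terms, here is a minimal Python sketch using the standard first-order relation delay = 40.3·TEC/f² (about 0.162 m of L1 delay per TECU); the scenario values are taken from the slide, everything else is illustrative:

    # Convert the scenario's TEC rates into L1 ranging terms.
    # First-order ionospheric delay: delay [m] = 40.3 * TEC / f^2,
    # with TEC in electrons/m^2 and 1 TECU = 1e16 el/m^2.
    F_L1_HZ = 1575.42e6
    M_PER_TECU = 40.3 * 1e16 / F_L1_HZ**2       # ~0.162 m of L1 delay per TECU

    dtec_dt = 0.74      # TECU/min, temporal rate from the scenario
    gradient = 0.085    # TECU/km, spatial gradient from the scenario

    print(f"temporal rate    ~ {dtec_dt * M_PER_TECU:.2f} m/min of L1 delay")   # ~0.12
    print(f"spatial gradient ~ {gradient * M_PER_TECU * 1e3:.1f} mm/km")        # ~13.8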

  5. Problem
  • Algorithm Tuned to Work on Scenarios
    • All passed easily
  • Did Not Work as Well on Real Data
    • Required extensive retuning
  • Simulated Ionosphere Did Not Faithfully Reproduce the Real Ionosphere
    • Real disturbances worse than predicted
    • Real slant-to-vertical errors better than predicted
  • Failing a Scenario Could Prove Loss of Integrity
  • However, Passing All Scenarios Would Not Demonstrate Positive Integrity
    • Worst-case scenario is algorithm dependent
    • Does not demonstrate that the probability-of-missed-detection requirement is met

  6. National Satellite Test-Bed
  • Prototyping Occurred During Solar Minimum
    • No significant ionospheric disturbances observed
    • Caused us to become overconfident
  • Performance Dominated by Receiver Artifacts
    • Reasonability checks instituted to mitigate these errors
    • Too aggressive; would have removed much of the behavior observed at solar maximum
  • Early Prototyping
    • Dual-frequency survey receivers
    • Single threaded
    • Initiated in 1993
    • Full deployment started in 1996

  7. 11-Year Solar Cycle
  • Solar activity changes dramatically over an 11-year solar cycle
  • Ionosphere at the peak is much worse than at minimum
  • Most disturbances occur at the peak and in the declining phase
  (Figure: solar activity over the cycle, with the NSTB and WIPP periods marked)

  8. WIPP
  • At the End of 1999, FAA Certification Required a Change in the Safety Analysis
    • Level D code not considered reliable
    • Threat models required for all monitors
    • Rigorous accounting for monitor observability
  • Certification of Ionospheric Algorithms Left to Ionospheric Experts
  • Experts Created Threat Models From Data
    • Reliable threats, not hypothetical ones
    • Must protect against worst observed conditions
    • Must overbound historical observations
    • Must have a demonstrable probability of missed detection

  9. Supertruth Data
  • 25 WRSs, 3 threads each
    • Carrier leveled
    • Biases removed
    • Voting to remove artifacts
  • Clean, Reliable Data Collected at the Peak of the Solar Cycle
  • Contained the Worst Observed Gradients (Temporal and Spatial over CONUS)
  • Most Severely Disturbed Days Formed the Basis for the Threat Model
  • Ionospheric Disturbances Are Deterministic, but Sampled Randomly
    • Worst cases are sampled over time
    • Will appear in the data as they move w.r.t. the IPPs
  • Apply Data Deprivation to Model the Effects of Poor Observability
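  A minimal Python sketch of two of the cleaning steps named above, carrier leveling and cross-thread voting, assuming simple per-arc averaging and a hypothetical 0.5 m agreement tolerance; the actual supertruth processing is more involved:

    import numpy as np

    def carrier_level(iono_code, iono_phase):
        """Level the precise-but-ambiguous carrier-phase ionosphere estimate to
        the noisy-but-unambiguous code estimate over one continuous arc by
        shifting it by the mean code-minus-phase difference."""
        return iono_phase + np.nanmean(iono_code - iono_phase)

    def vote_threads(a, b, c, tol_m=0.5):
        """Keep epochs where at least two of the three independent receiver
        threads agree to within tol_m meters; use their median as the trusted
        value and discard (NaN) the rest.  tol_m is an assumed placeholder."""
        stacked = np.vstack([a, b, c])
        med = np.median(stacked, axis=0)
        agree = np.sum(np.abs(stacked - med) < tol_m, axis=0) >= 2
        return np.where(agree, med, np.nan)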

  10. Ionospheric Measurements

  11. Storm Example

  12. Differences in Vertical Delay
  • Difference in Vertical Delay vs. IPP Separation Distance for Two Days:
    • Quiet day: July 2, 2000
    • Disturbed day: July 15, 2000
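  The plotted metric can be made concrete with a short sketch: for one epoch, pair every IPP with every other and record the absolute difference in vertical delay against the great-circle separation (arrays of IPP latitudes, longitudes, and vertical delays are assumed inputs):

    import numpy as np

    def delay_diff_vs_separation(lat_deg, lon_deg, vdelay_m, radius_km=6371.0):
        """For every pair of IPPs at one epoch, return (great-circle separation
        in km, |difference in vertical delay| in m) for plotting."""
        lat, lon = np.radians(lat_deg), np.radians(lon_deg)
        seps, diffs = [], []
        for i in range(len(vdelay_m)):
            for j in range(i + 1, len(vdelay_m)):
                cos_sig = (np.sin(lat[i]) * np.sin(lat[j]) +
                           np.cos(lat[i]) * np.cos(lat[j]) * np.cos(lon[i] - lon[j]))
                seps.append(radius_km * np.arccos(np.clip(cos_sig, -1.0, 1.0)))
                diffs.append(abs(vdelay_m[i] - vdelay_m[j]))
        return np.array(seps), np.array(diffs)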

  13. CONUS Ionosphere Threat
  • Not Well Modeled by a Local Planar Fit
    • Ionosphere well sampled [1]
    • Ionosphere poorly sampled [2]
  • Ionosphere Changes Over the Lifetime of the Correction
  • User Interpolation Introduces Error

  [1] “Robust Detection of Ionospheric Irregularities,” Walter et al., ION GPS 2000
  [2] “The WAAS Ionospheric Threat Model,” Sparks et al., Beacon Symposium 2001
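  For context on the last bullet, SBAS users interpolate the broadcast IGP vertical delays to their own pierce point; below is a minimal sketch of a four-point bilinear scheme of that kind (the corner ordering is my own and the three-point and high-latitude special cases are omitted, so treat the details as illustrative rather than as the MOPS specification):

    def user_interpolated_delay(xpp, ypp, tau_igp):
        """Bilinear interpolation of broadcast vertical delays to the user IPP.
        xpp, ypp: fractional position of the IPP inside the IGP cell (0..1).
        tau_igp: delays at the four surrounding IGPs, ordered here as the
                 corners at (1,1), (0,1), (0,0), (1,0) of the unit cell
                 (illustrative ordering; the MOPS defines its own numbering)."""
        w = (xpp * ypp,
             (1.0 - xpp) * ypp,
             (1.0 - xpp) * (1.0 - ypp),
             xpp * (1.0 - ypp))
        return sum(wi * ti for wi, ti in zip(w, tau_igp))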

  14. Well-Sampled Ionosphere
  • Chi-Square Metric Acts as a “Storm Detector”
    • Test using small decorrelation value
      • Nominally ~35 cm one-sigma
    • Passing test accepts larger value
      • Typically ~85 cm one-sigma
  • Analytic Approach
    • Does not require data except as validation
    • Fully specified before the first storm data of April 2000
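  A minimal sketch of a planar fit plus chi-square irregularity test of the kind described above; the decorrelation sigma and false-alarm probability are illustrative placeholders, not the certified WAAS values:

    import numpy as np
    from scipy.stats import chi2

    def storm_detector(east_km, north_km, vdelay_m, sigma_decorr=0.35, p_fa=1e-3):
        """Fit a plane I = a0 + a1*east + a2*north to the IPP vertical delays
        around an IGP, then test the normalized residuals with a chi-square
        statistic.  Returns (storm_detected, chi2_statistic)."""
        G = np.column_stack([np.ones_like(east_km), east_km, north_km])
        coef, *_ = np.linalg.lstsq(G, vdelay_m, rcond=None)
        resid = vdelay_m - G @ coef
        stat = float(np.sum((resid / sigma_decorr) ** 2))
        dof = len(vdelay_m) - 3                    # three planar-fit parameters
        return stat > chi2.ppf(1.0 - p_fa, dof), stat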

  15. Under-Sampled Ionosphere
  • Purely an Empirical Threat Model
  • Worst Storm Data Used
  • IPPs Removed From Estimation to Simulate Poor Sampling
    • Three quadrant-removal schemes
    • Storm detector used on remaining data
  • Threat Based on Worst Deviation for Given Set of Metrics
  • Threat Region Is a 5x5 Degree Cell Centered on the IGP
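  A minimal sketch of the data-deprivation step, reusing the planar-fit idea from the previous sketch: quadrants of IPPs around the IGP are withheld, the fit is made from what remains, and the deviation of the withheld IPPs from that fit is recorded. The removal schemes shown are illustrative choices, not the three used for WAAS:

    import numpy as np

    def quadrant_masks(east_km, north_km):
        """Illustrative removal schemes: withhold one, two adjacent, or three
        quadrants of IPPs around the IGP.  Returned masks are True = keep."""
        ne = (east_km >= 0) & (north_km >= 0)
        nw = (east_km < 0) & (north_km >= 0)
        sw = (east_km < 0) & (north_km < 0)
        return [~ne, ~(ne | nw), ~(ne | nw | sw)]

    def deprivation_deviation(east_km, north_km, vdelay_m, keep_mask):
        """Fit the plane from the kept IPPs only, then measure how far the
        withheld IPPs deviate from that fit (an undersampled-threat sample)."""
        G = np.column_stack([np.ones_like(east_km), east_km, north_km])
        coef, *_ = np.linalg.lstsq(G[keep_mask], vdelay_m[keep_mask], rcond=None)
        resid = vdelay_m[~keep_mask] - G[~keep_mask] @ coef
        return float(np.max(np.abs(resid))) if resid.size else 0.0

  The worst such deviation over the removal schemes, storm days, and metrics is what feeds the empirical threat table for the 5x5 degree cell.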

  16. CORS Data

  17. Goal of Sigma Undersampled
  • To protect against an unfortunate sampling of the ionosphere such that we fail to detect an existing disturbance
  • Presumes the ionosphere is non-uniform near the IGP, i.e., it is divided into at least two states: a quiet one that is sampled, and a disturbed one that is not

  18. Goal of Data Deprivation
  • To divide the IPPs into two groups: one that samples a relatively quiet ionosphere and does not trip the chi-square test, the other that samples ionosphere not well modeled from the quiet points
  • Data deprivation is used to simulate conditions that were not actually experienced but may reasonably be experienced in the future
  • It allows us to investigate threats that occurred in well-observed regions as though they had occurred in poorly observed regions
  • Want the quiet IPP distribution to match those that may occur on the operational system

  19. Storm Days
  • Over the last 5 years, ~100 active days have been identified
  • ~50 affected or would have affected WAAS performance
  • ~45 supertruth files generated
    • Working on files for the lesser days
  • 16 days affect our empirical threat model (serious effect)
  • All supertruth files available to the international SBAS community

  20. Temporal Threat Model
  • Also an Empirical Threat Model
    • Worst storm data used
    • Storm detector used
  • Look to See Largest Change With Respect to Planar Estimate
  • Overbound of Worst Rate of Change Ever Observed
  • Correlation with Spatial Gradient
    • Errors are currently double counted

  21. Temporal Threat Model
  • No Storm Detector
    • > 3 m / min
    • > 6 m / 2 min
    • > 7 m over 5 min
  • With Storm Detector
    • < 0.5 m / min
    • < 1.25 m / 2 min
    • < 2.5 m over 10 min
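  To make the with/without-detector split concrete, a small sketch encoding the numbers above as a lookup table (the semantics follow the slide: the first set of rates was exceeded when no detector was applied, while the second set bounds what remains once the storm detector has passed):

    # Numbers from the slide above: without the storm detector, changes
    # exceeding the first set were observed; with the storm detector, changes
    # stayed below the second set.  Entries are (interval in minutes, meters).
    OBSERVED_WITHOUT_DETECTOR = [(1, 3.0), (2, 6.0), (5, 7.0)]    # exceeded
    BOUND_WITH_DETECTOR       = [(1, 0.5), (2, 1.25), (10, 2.5)]  # not exceeded

    def temporal_overbound_m(minutes):
        """Bound on |change in vertical delay| over `minutes`, applicable only
        once the storm detector has passed; None beyond the listed intervals."""
        for interval, bound in BOUND_WITH_DETECTOR:
            if minutes <= interval:
                return bound
        return None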

  22. Post-IOC Storms
  • October 29-31 and November 20, 2003 Were Some of the Worst Storms Ever Observed
  • Conservatism in the GIVE Calculation Protected Users
    • No HMI observed at any location
    • None even close
  • However, Worse Than Predicted
    • Still much uncertainty in the ionosphere

  23. Conclusions
  • FAA Certification Required That All Users Be Bounded Under All Conditions
    • Ionospheric deviations are deterministic
    • Ionospheric deviations are observable
  • Threat Models Are Essential for Limiting Ionospheric Behavior
  • Each Monitor Must Account for the Limits of Its Observability
  • The Approach Is Very Conservative
    • We are still learning!
