
EUROCONTROL Navigation Domain



Presentation Transcript


  1. EUROCONTROL Navigation Domain The Data Collection Network: EGNOS revealed GNSS 2004 – Rotterdam, The Netherlands Santiago Soley, PILDO Labs S.L.

  2. Summary • EUROCONTROL: GNSS-1 Operational Validation • The Data Collection Network • A Methodology to assess EGNOS performance • First Glance • Anomaly Investigation • Towards the Final Results • Conclusions and Future Work

  3. EUROCONTROL The European Organisation for the Safety of Air Navigation • 31 Member States

  4. GNSS-1 Operational Validation • EUROCONTROL will coordinate the GNSS-1 Operational Validation: • To support the Member States in achieving approval for the provision of EGNOS services throughout ECAC (European Civil Aviation Conference) • Two distinct areas of work: • Technical validation of the performance • SARPS requirements – RNP values • MOPS DO-229C-compliant receiver • Definition of operational rules and procedures for aircraft to use the system for a particular application (APV implementation)

  5. The Data Collection Network: Objectives • Collect ESTB/EGNOS data from different sites in Europe and evaluate it • Group of experts to support: • the development of the tools needed in the future EGNOS Operational Validation – PEGASUS • the definition of data collection and processing procedures • Baseline for the future static campaign in the GOV Plan – extended with the participation of the Air Navigation Service Providers

  6. The Data Collection Network [Map of the network, GOV/ANSP sites: ESTB reference stations (RIMS), EURIDIS reference stations, ESTB processing facilities (CPF, MCC), NLES and the GEO IOR PRN131; sites include Tromsø, Höfn, Hønefoss, Rotterdam, Scilly, Toulouse, Lisboa, Barcelona, Fucino, Ankara, Palma, Málaga, Kourou, Matera, the Canaries and Hartebeesthoek]

  7. The Methodology [Processing-flow diagram] Processed data samples pass through data-sample filtering (plausibility rules/flags) and then feed three strands: • Actual performance against SARPS thresholds (integrity, accuracy, availability, continuity) via event counting, visualisation and position-domain parameters • System behaviour via anomaly investigation and system analysis (file watch, message types, ionosphere, ranges, clock, multipath, etc.) • Confidence via extrapolation/simulation (integrity in the range domain, RNP availability, continuity)
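The filtering and position-domain steps of this methodology can be sketched as follows. The field names and the rank-based 95th-percentile rule are illustrative assumptions, not details taken from the presentation.

```python
# Hypothetical sketch of the "first glance" step: filter implausible epochs,
# then compute position-domain 95% statistics. Field names are illustrative.

def percentile_95(values):
    """95th percentile by rank (no interpolation), a common choice for NSE 95%."""
    ordered = sorted(values)
    k = int(0.95 * len(ordered) + 0.9999)  # ceiling of 0.95 * n
    return ordered[min(k, len(ordered)) - 1]

def first_glance(epochs):
    """epochs: list of dicts with 'hpe', 'vpe' (metres) and a 'valid' flag."""
    good = [e for e in epochs if e["valid"]]          # plausibility filtering
    return {
        "hpe95": percentile_95([e["hpe"] for e in good]),
        "vpe95": percentile_95([e["vpe"] for e in good]),
        "n_used": len(good),
    }
```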

  8. A First Glance • Check against RNP parameters (accuracy, availability, continuity, integrity) • test fail or pass • Proposed algorithms • single site/day EGNOS post-SIS1 EEC, April 23rd, 2004
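A minimal pass/fail accuracy check of the kind described might look as follows. The threshold figures are the commonly quoted ICAO Annex 10 values for APV-I/APV-II and are included only for illustration, not taken from the presentation.

```python
# Illustrative SARPS-style accuracy thresholds (metres, 95%); values as
# commonly quoted for APV-I/APV-II, included here as an assumption.
THRESHOLDS = {
    "APV-I":  {"h_acc_95": 16.0, "v_acc_95": 20.0, "HAL": 40.0, "VAL": 50.0},
    "APV-II": {"h_acc_95": 16.0, "v_acc_95": 8.0,  "HAL": 40.0, "VAL": 20.0},
}

def accuracy_test(mode, hpe95, vpe95):
    """Pass/fail: measured 95% position errors against the mode's thresholds."""
    t = THRESHOLDS[mode]
    return hpe95 <= t["h_acc_95"] and vpe95 <= t["v_acc_95"]
```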

  9. Summary of results – Vertical Position Error 95%

  10. Summary of results – APV-II Availability (Alert Limit 20 m)
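The APV-II availability figure on this slide is, in essence, the fraction of epochs in which the protection levels stay inside the alert limits. A minimal sketch, assuming per-epoch (HPL, VPL) pairs in metres:

```python
def apv2_availability(epochs, hal=40.0, val=20.0):
    """Fraction of epochs in which the protection levels bound the alert
    limits (HPL <= HAL and VPL <= VAL), i.e. APV-II service is available."""
    ok = sum(1 for hpl, vpl in epochs if hpl <= hal and vpl <= val)
    return ok / len(epochs)
```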

  11. Anomaly Investigation • What to do if anomalies appear that make some of the tests fail? • A first check on the obtained performance is not enough to declare the system non-compliant with the requirements • detailed analysis of what caused a test failure • Different methods and techniques used by the Network
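A first step in this kind of investigation is locating the integrity events themselves: epochs where the position error exceeds the protection level (misleading information, MI), grouped into contiguous runs. A sketch, with the sample layout as an assumption:

```python
def integrity_events(samples):
    """samples: iterable of (epoch, position_error, protection_level).
    Returns (start, end) epoch pairs of contiguous MI runs where PE > PL."""
    events, start, prev = [], None, None
    for t, pe, pl in samples:
        if pe > pl:                      # misleading information at this epoch
            if start is None:
                start = t
            prev = t
        elif start is not None:          # run just ended
            events.append((start, prev))
            start = None
    if start is not None:                # run extends to the last epoch
        events.append((start, prev))
    return events
```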

  12. Example 1: Jump in the VPE Jump in the position error, experienced periodically Delft, December 26th, 2002

  13. Example 1: Jump in the VPE Multipath – MC (code-minus-carrier) combination Delft, December 26th, 2002
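The code-minus-carrier multipath combination used in this kind of diagnosis can be sketched as follows; it removes geometry, clocks and first-order ionosphere, leaving code multipath plus biases and noise. This is the classic dual-frequency formula, offered here as an illustration of the technique rather than the exact combination used on the slide.

```python
F1, F2 = 1575.42e6, 1227.60e6        # GPS L1/L2 carrier frequencies (Hz)
ALPHA = (F1 / F2) ** 2               # frequency-squared ratio, ~1.6469

def mp1(c1, l1, l2):
    """L1 code-multipath combination (metres). c1 is the C1 pseudorange,
    l1/l2 the carrier phases, all expressed in metres."""
    return c1 - (ALPHA + 1) / (ALPHA - 1) * l1 + 2.0 / (ALPHA - 1) * l2
```

With synthetic observables (geometry r, ionospheric delay I on L1: c1 = r + I, l1 = r − I, l2 = r − ALPHA·I) both r and I cancel, so a clean signal gives MP1 ≈ 0.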

  14. Example 2: Integrity Failures Integrity failures: HPE>HPL Barcelona, February 6th, 2003

  15. Example 2: Integrity Failures MT02/03 anomalies • Pseudorange correction oscillations broadcast in MT02 and MT03 • UDREI < 12 • IODF < 3 Barcelona, February 6th, 2003
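A crude way to flag the correction oscillations described here is to count sign reversals in the consecutive differences of a pseudorange-correction time series. The detector below is an illustrative assumption, not the Network's actual method:

```python
def oscillation_score(prc_series):
    """Count sign reversals between consecutive differences of a
    pseudorange-correction series; a high count suggests oscillation."""
    diffs = [b - a for a, b in zip(prc_series, prc_series[1:])]
    return sum(1 for a, b in zip(diffs, diffs[1:]) if a * b < 0)
```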

  16. Example 3: Integrity Failures MT26 anomalies Ionospheric corrections for all visible satellites (PRN17 in red) 60 MI in about 500 epochs Barcelona, September 12th, 2002

  17. Example 4: Integrity Failures UPC1: Vertical component prefit residuals 8 consecutive MI Barcelona, May 22nd, 2003

  18. Example 4: Integrity Failures Satellite clock jump • Sudden jump in the C1 code pseudorange for PRN 29 • Fast corrections (UDREI): no alarm conditions (IODF < 3 & UDREI < 15) • The ESTB declares the satellite NOT MONITORED (UDREI = 14) 9 seconds later Prefit residuals Barcelona, May 22nd, 2003
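The screening implied by this slide can be sketched as a simple per-satellite usability test: in the DO-229 convention UDREI 14 means "Not Monitored" and 15 "Do Not Use", and the slide treats IODF < 3 as the nominal fast-correction condition. A minimal sketch under those assumptions:

```python
def satellite_usable(udrei, iodf):
    """True if the fast-correction flags allow using the satellite:
    UDREI 14 = Not Monitored, 15 = Do Not Use; IODF < 3 is nominal."""
    if udrei >= 14:
        return False
    return iodf < 3
```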

  19. Example 5: Integrity Failures VPE/VPL Prefit residuals for PRN02, PRN11 and PRN16 Large number of MI between 19:17 and 20:40 Barcelona, November 20th, 2003

  20. Example 5: Integrity Failures PRN16: C1−ρ (shifted), L1−L2 (shifted), P2−P1 (shifted) STEC (m, L1) PRN16: STEC ± UIRE (ESTB) 19:17–20:40 Barcelona, November 20th, 2003
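The slant ionospheric delay (STEC in L1 metres) shown on this slide can be recovered from the geometry-free code combination P2 − P1, since the first-order ionospheric delay scales with the inverse frequency squared. A sketch of that standard relation:

```python
F1, F2 = 1575.42e6, 1227.60e6        # GPS L1/L2 carrier frequencies (Hz)
ALPHA = (F1 / F2) ** 2               # ~1.6469

def iono_delay_l1(p1, p2):
    """First-order slant ionospheric delay on L1 (metres) from the
    geometry-free code combination: I1 = (P2 - P1) / (ALPHA - 1)."""
    return (p2 - p1) / (ALPHA - 1.0)
```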

  21. Example 5: Integrity Failures Ionospheric storm Kp index from Oct. 17th to Dec. 1st, 2003 (activity bands Low, Medium, High, Very High, Extreme; storm peaks October 29–31st and November 20–21st, 18:00–22:00)
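A banding of the Kp index like the one on this plot can be sketched as below. The band boundaries are an assumption loosely following the NOAA G-scale; the presentation does not state which thresholds it used.

```python
def kp_band(kp):
    """Map a Kp index value to an activity band (boundaries are assumed,
    loosely following the NOAA geomagnetic-storm scale)."""
    if kp < 4:
        return "Low"
    if kp < 6:
        return "Medium"
    if kp < 7:
        return "High"
    if kp < 9:
        return "Very High"
    return "Extreme"
```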

  22. Towards the Final Results • To validate SARPS requirements in a reasonable period of time, the data collected from the network alone would not be enough • simulations or other methods – to reach the required level of confidence from the measurements • Impossible to do it for all locations under all conditions • extrapolation of the local results from the network sites • Global Assessment
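Why measurements alone fall short can be made concrete with the standard zero-failure binomial bound: to claim a failure probability below p at confidence C with no observed failures, one needs roughly n ≥ ln(1 − C)/ln(1 − p) independent samples, which for SARPS-level integrity risks is enormous. A sketch of that calculation:

```python
import math

def samples_for_zero_failures(p_req, confidence=0.95):
    """Independent samples needed, with zero observed failures, to claim the
    per-sample failure probability is below p_req at the given confidence.
    From the binomial zero-failure bound: (1 - p_req)**n <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_req))
```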

  23. The Global Monitoring System [Architecture diagram] Automatic data gathering of 24 h RINEX files from GPS networks over the Internet; automatic data collecting of 24 h GPS and GEO binary data from the BRUS receiver (Slog); automatic daily processing with PEGASUS on Linux; automatic results presentation via a web server and e-mail
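The automated daily chain on this slide could be orchestrated by a small script along the following lines. The directory layout, file-name pattern and the `process` hook (standing in for a PEGASUS batch run) are hypothetical placeholders, not details from the presentation.

```python
import datetime
import pathlib

def daily_job(inbox, process):
    """Gather today's raw files from `inbox` and run `process` on each.
    `process` is a callable taking a file path (e.g. a PEGASUS batch run
    in the real system) and returning a results identifier."""
    day = datetime.date.today().strftime("%Y%m%d")
    results = []
    for raw in sorted(pathlib.Path(inbox).glob(f"*{day}*")):
        results.append(process(raw))
    # in the real chain, results would then be published via web server/e-mail
    return results
```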

  24. The Global Monitoring System • Cross-check with simulations • Network results extrapolation • Global Assessment [Maps: HPE 95% & integrity events; HPL 99% & satellites used]

  25. Conclusions • Summary of the activities of the Data Collection Network – EUROCONTROL • gain knowledge on how the data need to be collected, processed and analysed – EGNOS Operational Validation • Methodology (3 axes) • Actual performance – first-glance report • System behaviour – anomaly investigation • Confidence, extrapolation – Global Monitoring System • Dynamic trials

  26. Future Work • Improving the network layout and the automation of procedures • continuous data logging • automated results data sharing – FTP • Potential data exploitation and validation of the Global Monitoring System results • harmonisation with the network sites' results • Data campaigns with the first EGNOS SIS • reuse of all the ESTB lessons learned • the encountered anomalies are expected to happen rarely • EGNOS SIS-1 confirms that

  27. Questions?
