
Project Context and Objectives


Presentation Transcript


  1. BIOMASS End-to-End Mission Performance Simulator. Paco López-Dekker, Francesco De Zan, Thomas Börner, Marwan Younis, Kostas Papathanassiou (DLR); Tomás Guardabrazo (DEIMOS); Valerie Bourlon, Sophie Ramongassie, Nicolas Taveneau (TAS-F); Lars Ulander, Daniel Murdin (FOI); Neil Rogers, Shaun Quegan (U. Sheffield) and Raffaella Franco (ESA). Microwaves and Radar Institute, German Aerospace Center (DLR)

  2. Project Context and Objectives
  • BEES: BIOMASS End-to-End (mission performance) Simulator
  • ESA-funded project in the context of the BIOMASS EE-7 Phase-A study
  • Provides a tool to evaluate the expected end-to-end performance of the mission:
    • Realistic, distributed scenes
    • Modeling of residual system errors (noise, ambiguities, instrument stability, channel imbalances…)
    • Ionospheric disturbances (Faraday rotation and scintillation)
    • Processing: L0, L1, L1b; ionospheric error correction; L2 retrieval
  • Focus on including all main effects and disturbances
  • Not a detailed instrument simulator

  3. BEES Overview

  4. BEES Modules
  • “Engineering” modules:
    • Geometry Module (GM): provides a common geometry to all modules [DEIMOS]
    • Observing System Simulator (OSS-A & OSS-B) [A: DLR; B: Thales Alenia Space]
    • Product Generation Module(s) [DLR]: PGM-L1a, PGM-L1b
  • “Scientific” modules:
    • Scene Generation Module (SGM) [DLR + U. Chalmers]
    • Ionospheric modules [U. of Sheffield]: Ionospheric Generation Module (IGM) and Ionospheric Correction Module (ICM)
    • Level-2 Retrieval Module (L2RM) [FOI]
    • Performance evaluation modules [DLR]: PEM-L1b, PEM-L2

  5. BEES Block Diagram: OpenSF Simulation Control
  • OpenSF drives the E2ES. This includes:
    • UI
    • Execution of Monte Carlo runs
    • Etc.

  6. BEES Diagram: OSS
  • Three sub-modules:
    • Dummy Radar Parameter Generator (RPG)
    • System Errors and Sensitivity Module (SES)
    • Impulse Response Function (IRF) Module
  • IRF strategy: the IRF models the SAR system plus processing, which avoids the generation of RAW data
  • SES strategy: model the residual errors
  • Two OSS versions, corresponding to the two industry Phase-A studies

  7. SGM: Scene Definition
  • Forest type (out of a predefined list)
  • Mean biomass level (per-hectare level)
  • Spatial distribution of “single” trees, each with an individual (top) height/biomass tag
  • Output on a 100x100 m grid, fed to the forward model:
    • Biomass (t/ha)
    • Tree height (h100)
  (Example scenes: 0 t/ha; 500 t/ha; 300 t/ha with Clark-Evans index 0.8; 200 t/ha with Clark-Evans index 1.8)

  8. SGM output (ground truth): biomass and tree height (H100) maps

  9. Input to PGM: PolInSAR covariance matrices (panels: σHH, σHV, σVV)

  10. Input to PGM: PolInSAR covariance matrices (panels: ρHH1-HH2, ρHV1-HV2, ρVV1-VV2)

  11. BEES Block Diagram: PGM

  12. Review of the PGM Algorithm: Macro Steps (with input modules)
  • Generation of the interferometric/polarimetric channels for the scatter (correlated) and the noise (uncorrelated) [SGM, OSS]
  • Spectral shift modulation (geometric decorrelation, part I) [GM]
  • 2-D convolution [OSS]
  • Spectral shift demodulation (geometric decorrelation, part II)
  • L1a product generation [GM]
  • Addition of the ionospheric phase screen (scintillations) and Faraday rotation [IM, GM]
  • Ambiguity stacking [OSS]
  • Additional system disturbances (cross-talk, phase and gain drifts…) [OSS]
  • Ionospheric correction [ICM]
  • L1b product generation (multilooking) [GM]

  13. Multichannel Signal Simulation
  • Independent complex channels (#1 … #N) are passed through spatial convolutions and a linear combination of channels, yielding correlated complex channels with the desired spectral properties for each channel
  (Example output panels: SLC (HH), coherence (HH-HH), tree height)
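The channel linear combination can be sketched numerically. This is a hedged illustration, not the BEES implementation: independent circular-Gaussian speckle channels are multiplied by the Cholesky factor of a desired covariance matrix, which imposes the target coherences; the covariance values and channel count are invented for the example.

```python
import numpy as np

# Sketch (not BEES code): impose a desired coherence structure on
# independent speckle channels through a linear combination, here the
# Cholesky factor of the target covariance matrix.
rng = np.random.default_rng(0)

n_channels, n_pix = 3, 100_000

# Desired covariance matrix (Hermitian, positive definite); the
# off-diagonal entries set the complex coherence between channel pairs.
C = np.array([[1.0, 0.7, 0.5],
              [0.7, 1.0, 0.7],
              [0.5, 0.7, 1.0]], dtype=complex)

# Independent, unit-power, circular complex Gaussian speckle channels.
z = (rng.standard_normal((n_channels, n_pix))
     + 1j * rng.standard_normal((n_channels, n_pix))) / np.sqrt(2)

L = np.linalg.cholesky(C)   # C = L @ L.conj().T
y = L @ z                   # correlated channels with covariance C

# The sample coherence between channels 0 and 1 approaches C[0, 1].
coh = np.vdot(y[1], y[0]) / np.sqrt(np.vdot(y[0], y[0]).real
                                    * np.vdot(y[1], y[1]).real)
print(abs(coh))  # close to 0.7
```

The same idea extends to spatially varying covariances by applying a different linear combination per pixel block, which is how a "desired spectral property per channel" can be enforced after the spatial convolutions.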

  14. Introduction of the Ionospheric Distortion
  • The ionospheric distortion cannot be applied directly to the raw data, because the raw-data distortion is target dependent: the aperture for Target 1 traverses a different part of the ionosphere (modeled as a layer) than the aperture for Target 2
  • For an orbit at ionosphere height, the distortions can be applied directly to the raw data: a lower (virtual) orbit with an equivalent aperture length
  • The aperture angle is what really matters

  15. BEES Block Diagram: Ionosphere
  • The simulation of the ionosphere is divided into two steps: first the spectral coefficients describing the state of the ionosphere are generated; then, for a given spectrum, random realizations are generated
  • This block applies the ionospheric correction (Faraday rotation and shifts only)

  16. Level-2 Retrieval: discussed in the previous talk!

  17. L2 Retrieved Heights (H100)
  • SGM (ground truth) vs. L2 retrieval
  • Range-dependent H100 bias: software bug or realistic feature?

  18. L2 Retrieved Biomass
  • SGM (ground truth) vs. L2 retrieval

  19. Performance Evaluation (L1b)
  • L1b performance in terms of element-wise covariance-matrix errors:
    • Bias
    • Standard deviation
  • In the example: significant coherence loss, due to the spectral shift

  20. Performance Evaluation (L2)
  • L2 performance in terms of biomass and tree-height errors:
    • Bias
    • Standard deviation
  • Error statistics vs. range and biomass levels
  • In the example: does the height error lead to the biomass error?

  21. Performance Evaluation (L2) (continued)

  22. Monte Carlo (Multiple Runs of BEES)
  • Monte Carlo simulations are implemented by OpenSF: BEES is run repeatedly, perturbing (if necessary) some input parameters
  • Perturbation approach:
    • Random realizations are implemented by the modules (OpenSF can provide a varying seed for independent realizations)
    • This gives control of the randomization to the module developers, in order to ensure physical correctness
  • Most of this randomness is introduced by the IGM and the PGM
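The seeding scheme can be illustrated with a minimal sketch. The function names and numbers below are invented (this is not the OpenSF or BEES API): a driver hands every run a distinct seed, the "module" draws its own random realization from it, and error statistics are computed over the ensemble.

```python
import numpy as np

# Hypothetical sketch of seeded Monte Carlo runs (not the real interfaces).
def run_bees_once(seed: int) -> float:
    """Stand-in for one end-to-end BEES run; returns a scalar L2 metric."""
    rng = np.random.default_rng(seed)        # module-owned randomness
    truth = 100.0                            # e.g. biomass in t/ha
    return truth + rng.normal(0.0, 5.0)      # retrieval with random error

n_runs = 1000
estimates = np.array([run_bees_once(seed) for seed in range(n_runs)])

# Error statistics over the Monte Carlo ensemble.
bias = estimates.mean() - 100.0
std = estimates.std(ddof=1)
print(f"bias={bias:.2f} t/ha, std={std:.2f} t/ha")
```

Keeping the random draw inside the module, with only the seed supplied externally, is what lets the developers guarantee that each realization stays physically consistent.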

  23. Notes on Validation

  24. Validation: Challenges and Strategy
  • BEES is a complex software tool comprising modules developed by different teams under heterogeneous environments
  • How do we know that the outputs are correct? We are developing the tool precisely because we do not know (exactly) what we will get!
  • We are simulating random processes:
    • Speckle
    • Random noise
    • Random hardware disturbances
    • Random realizations of the ionosphere
    • …
  • Validating the software requires approaches that resemble the post-launch validation/calibration of a real system:
    • Homogeneous scenes
    • Point targets
  • Validation needs to check whether the resulting statistics for some canonical cases agree with theory

  25. Example: NESZ Validation
  • NESZ is range dependent
  • The simulated NESZ is tested against the nominal NESZ value; the threshold is designed for a failure probability of 10^-3
  (Plot annotations: test success / test failure regions around the nominal NESZ curve)
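The test logic can be sketched as follows; all numbers are illustrative, not mission values. The mean power of noise-only pixels is compared against a band around the nominal NESZ, sized so that a correct simulator fails with probability 10^-3.

```python
import numpy as np

# Hedged sketch of an NESZ check (illustrative numbers, range dependence
# omitted): estimate noise power from noise-only pixels and test it
# against a band around the nominal value.
rng = np.random.default_rng(1)

nesz_nominal = 1.0      # nominal NESZ in linear units
n_pix = 100_000

# Complex thermal noise with the nominal total power.
noise = (rng.standard_normal(n_pix)
         + 1j * rng.standard_normal(n_pix)) * np.sqrt(nesz_nominal / 2)
p_hat = np.mean(np.abs(noise) ** 2)

# The mean of n_pix exponential power samples is approximately Gaussian
# with standard deviation nesz/sqrt(n_pix); +/-3.29 sigma gives a
# two-sided band with 1e-3 total failure probability.
sigma = nesz_nominal / np.sqrt(n_pix)
passed = abs(p_hat - nesz_nominal) < 3.29 * sigma
print(p_hat, passed)
```

In the range-dependent case the same test would be repeated per range bin against the nominal NESZ curve.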

  26. Example: PGM L1b Verification (Probabilistic Threshold)
  • Due to the random nature of speckle, the estimated covariance matrices will not be identical to the true ones (even when all error sources are turned off)
  • We can, however, evaluate the likelihood of a certain output given the input in probabilistic terms (e.g. using confidence intervals)
  • The test uses the complex coherences, i.e. the normalized elements of the sample covariance matrix
  • Using a probability threshold (th), it is possible to bound the deviation
  • The threshold is a function of the desired error (t), the input coherence (γ) and the number of looks (L)
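One way to realize this threshold, sketched here under simplifying assumptions (Gaussian speckle, no system errors; not the BEES code): for a true coherence γ and L looks, the sampling distribution of the sample coherence magnitude is estimated by Monte Carlo, and its two-sided 10^-3 quantiles give the pass/fail band for a measured L1b coherence.

```python
import numpy as np

# Hedged sketch of the probabilistic pass/fail band for the sample
# coherence magnitude.
rng = np.random.default_rng(2)

def coherence_samples(gamma: float, looks: int, n_trials: int) -> np.ndarray:
    """Draw n_trials realizations of the L-look sample coherence magnitude."""
    shape = (n_trials, looks)
    z1 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    z2 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    y2 = gamma * z1 + np.sqrt(1.0 - gamma ** 2) * z2  # coherence gamma with z1
    num = np.abs((z1 * y2.conj()).sum(axis=1))
    den = np.sqrt((np.abs(z1) ** 2).sum(axis=1) * (np.abs(y2) ** 2).sum(axis=1))
    return num / den

gamma, looks, th = 0.7, 250, 1e-3
dist = coherence_samples(gamma, looks, n_trials=10_000)
lo, hi = np.quantile(dist, [th / 2, 1.0 - th / 2])

# A measured L1b coherence outside [lo, hi] flags a likely simulator error.
print(lo, hi)
```

Tabulating such bands over γ and L gives a threshold function of exactly the three quantities named on the slide.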

  27. PGM L1b Verification: Caveat!
  • The assumption that the estimate is unbiased does not hold for high coherences and low numbers of looks
  • For a given coherence, one has to make sure that enough looks are taken into account
  • Histograms from simulations: 10^5 simulations with γ=0.5, L=250; 10^5 simulations with γ=0.95, L=30
  • To validate the simulator we need (to simulate) large, homogeneous scenes. Sound familiar?
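The look-number caveat is easy to reproduce in miniature (a sketch, not BEES code): the sample coherence magnitude is biased high, and the bias grows as the number of looks shrinks, so too few looks make a correct simulator look "too coherent".

```python
import numpy as np

# Numerical illustration of the coherence-estimator bias vs. looks.
rng = np.random.default_rng(3)

def mean_coherence(gamma: float, looks: int, n_trials: int = 50_000) -> float:
    """Mean of the L-look sample coherence magnitude over n_trials."""
    shape = (n_trials, looks)
    z1 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    z2 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    y2 = gamma * z1 + np.sqrt(1.0 - gamma ** 2) * z2  # true coherence = gamma
    num = np.abs((z1 * y2.conj()).sum(axis=1))
    den = np.sqrt((np.abs(z1) ** 2).sum(axis=1) * (np.abs(y2) ** 2).sum(axis=1))
    return float((num / den).mean())

gamma = 0.5
few = mean_coherence(gamma, looks=4)     # few looks: clearly biased high
many = mean_coherence(gamma, looks=250)  # many looks: close to the true 0.5
print(few, many)
```

This is why the validation scenes must be large and homogeneous: only then are enough independent looks available to keep the estimator bias below the test tolerance.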

  28. Project Status/Outlook
  • Software almost completed:
    • Full handling of ambiguities still missing
    • Some ionospheric features/possibilities pending
  • Validation and debugging on-going: distinguishing between bugs and features is not easy!
  • Mission performance assessment: once BEES is validated, it will be used to assess the mission performance for both Phase-A designs
    • Hundreds of test cases, each requiring “N” Monte Carlo repetitions
    • Weeks of simulation time
