
Communicating Uncertainties for Microwave-Based ESDRs

Frank J. Wentz, Carl A. Mears, and Deborah K. Smith, Remote Sensing Systems, Santa Rosa, CA. Supported by the NASA MEaSUREs Program. Carl Mears poster IN21A-1405: Uncertainty Estimates for MSU/AMSU Derived Atmospheric Temperatures.


Presentation Transcript


  1. Communicating Uncertainties for Microwave-Based ESDRs
  Frank J. Wentz, Carl A. Mears, and Deborah K. Smith, Remote Sensing Systems, Santa Rosa, CA
  Supported by: NASA MEaSUREs Program
  Carl Mears poster IN21A-1405: Uncertainty Estimates for MSU/AMSU Derived Atmospheric Temperatures
  Kyle Hilburn poster IN21A-1418: Decadal Trends and Variability in Special Sensor Microwave/Imager (SSM/I) Brightness Temperatures and Earth Incidence Angle
  Presented at AGU, San Francisco, December 6, 2011; session IN24A

  2. DISCOVER Project
  Distributed Information Services: Climate/Ocean Products and Visualizations for Earth Research
  • A collaboration between Remote Sensing Systems and the University of Alabama in Huntsville
  • Supported by two NASA programs:
    • MEaSUREs (Making Earth Science Data Records for Use in Research Environments)
    • Earth System Data Records Uncertainty Analysis
  23 satellite microwave sensor (radiometer and scatterometer) products: sea-surface temperature and winds, water vapor, cloud water, rain rate

  3. Large, Heterogeneous User Base
  • 5,000 to 10,000 distinct users
  • Applications ranging from tracking seals and albatrosses to high-precision climate monitoring
  • 500 peer-reviewed journal papers have used DISCOVER data
  [Figure: geographic distribution of users]

  4. Providing Information on Accuracy of Products is Essential
  An estimate of a variable without an assigned uncertainty is, in some sense, meaningless. Usually there is an implied uncertainty, e.g., SST error = 0.5 C, wind error = 1 m/s, etc. However, for satellite retrievals the real uncertainties are usually dynamic and complex. This variability in the uncertainty must be communicated to the users.
  Multiple approaches are required:
  • Quality flags: the traditional approach
  • Formal errors to assess algorithm input errors
  • Simultaneous retrievals from multiple algorithms to assess algorithm-assumption errors
  • For climate trends, compare results from different satellites

  5. Traditional Quality Flags
  • Indicate the occurrence of certain events that may affect quality:
    • Anomalous spacecraft attitude
    • Anomalous on-board calibration
    • Close to land
    • Possible sea ice
    • Etc.
  • Usually include summary bits (or values) as a guide for data inclusion/exclusion
  • A useful, simple approach, but it lacks quantitative information:
    • Need to provide users with information on the percentage of data excluded
    • Need to develop common notation/definitions among data providers
    • Error characterization is much more complex than can be captured by a few bits
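The summary-bit scheme described on this slide can be sketched as a simple bitmask. This is an illustrative layout only: the bit positions and the decode_flags helper below are hypothetical, not the actual DISCOVER flag definitions.

```python
# Hypothetical quality-flag bit layout (illustrative; not the real
# DISCOVER bit assignments).
ANOMALOUS_ATTITUDE    = 1 << 0
ANOMALOUS_CALIBRATION = 1 << 1
NEAR_LAND             = 1 << 2
POSSIBLE_SEA_ICE      = 1 << 3
SUMMARY_BAD           = 1 << 7   # summary bit: recommend excluding this observation

_FLAG_NAMES = {
    ANOMALOUS_ATTITUDE: "anomalous spacecraft attitude",
    ANOMALOUS_CALIBRATION: "anomalous on-board calibration",
    NEAR_LAND: "close to land",
    POSSIBLE_SEA_ICE: "possible sea ice",
    SUMMARY_BAD: "excluded by summary bit",
}

def decode_flags(flag: int) -> list[str]:
    """Return the list of conditions set in a flag byte."""
    return [name for bit, name in _FLAG_NAMES.items() if flag & bit]

flag = NEAR_LAND | POSSIBLE_SEA_ICE
print(decode_flags(flag))
```

Note how little this carries: a few bits record that an event occurred, but nothing about its magnitude, which is exactly the limitation the slide points out.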

  6. Computation of Formal Error Estimates: Assessment of Algorithm Input Errors
  Determine the sensitivity of the retrieval algorithm to errors in its inputs:
  • Brightness temperatures (TB) → Retrieval Algorithm → SST, wind, vapor, cloud, rain
  • Ancillary data → Retrieval Algorithm → SST, wind, vapor, cloud, rain
  • Incidence angle → Retrieval Algorithm → SST, wind, vapor, cloud, rain
  • Hot-load temperature → Retrieval Algorithm → SST, wind, vapor, cloud, rain
  For every observation, the retrieval algorithm is run many times to determine these sensitivities (EP denotes environmental parameter). The entire AMSR-E mission is being completely reprocessed with a formal error assigned to each retrieval.
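The "run the algorithm many times" sensitivity computation can be sketched as finite-difference error propagation. The formal_error function and the toy linear retrieval below are hypothetical stand-ins for the real AMSR-E algorithm, and the sketch assumes independent input errors so the variances add.

```python
import numpy as np

def formal_error(retrieval, inputs, input_sigmas, delta=1e-3):
    """Propagate input errors through a retrieval by finite differences:
    sigma_ret^2 = sum_i (dR/dx_i)^2 * sigma_i^2  (independent errors assumed).
    `retrieval` is a stand-in for the real algorithm."""
    x0 = np.asarray(inputs, dtype=float)
    r0 = retrieval(x0)
    var = 0.0
    for i, sigma in enumerate(input_sigmas):
        x = x0.copy()
        x[i] += delta                          # perturb one input at a time
        dr_dx = (retrieval(x) - r0) / delta    # sensitivity to input i
        var += (dr_dx * sigma) ** 2
    return float(np.sqrt(var))

# Toy "retrieval", linear in its inputs so the answer is analytic:
# sensitivities are 2.0 and -0.5, so the error is sqrt(1.0^2 + 0.5^2).
toy = lambda x: 2.0 * x[0] - 0.5 * x[1]
err = formal_error(toy, [250.0, 10.0], [0.5, 1.0])
print(err)  # approximately 1.118
```

Because the real retrieval is nonlinear, the sensitivities (and hence the formal error) vary from observation to observation, which is why each retrieval gets its own error estimate.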

  7. Formal Error Estimates for AMSR-E Water Vapor
  Water vapor is a very dynamic quantity. Errors increase in very MOIST AIR and also in HEAVY RAIN.

  8. Verification of Formal Error Estimates
  [Figure: SST error determined from buoys compared with the formal estimate]

  9. Provide Simultaneous Retrievals from Multiple Algorithms: Assessment of Algorithm Assumption Errors
  Example: rain rate. Much of the error is due to the assumptions built into the algorithm.
  [Figure panels: RSS; Petty; RSS − Petty; RSS − Petty vs. RSS]
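The kind of algorithm-vs-algorithm comparison shown on this slide can be sketched on synthetic data. Everything below is fabricated for illustration: the two "algorithms" are noisy transforms of a synthetic rain field, not the actual RSS or Petty retrievals, and the bin edges are arbitrary. The point is the diagnostic, binning the inter-algorithm difference against retrieved rain rate to see where the assumption-driven disagreement lives.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.gamma(2.0, 2.0, size=50_000)                  # synthetic rain rates (mm/h)
alg_a = truth * (1 + rng.normal(0, 0.10, truth.size))     # stand-in for algorithm A
alg_b = 0.95 * truth * (1 + rng.normal(0, 0.15, truth.size))  # different assumptions

diff = alg_a - alg_b
bins = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 64.0])
idx = np.digitize(alg_a, bins) - 1                        # bin by algorithm-A rain rate

for b in range(len(bins) - 1):
    sel = idx == b
    print(f"{bins[b]:>4.0f}-{bins[b + 1]:<4.0f} mm/h: "
          f"mean diff {diff[sel].mean():+.2f}, spread {diff[sel].std():.2f}")
```

With multiplicative errors, both the mean offset and the spread of the difference grow with rain rate, mirroring the slide's message that much of the rain-rate error is assumption-driven rather than input-noise-driven.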

  10. Inter-Comparison of Satellite Wind Time Series
  F13 SSM/I, F16 and F17 SSMIS, WindSat, and AMSR-E
  [Figure annotations: F16 SSMIS problem; AMSR-E minus WindSat ??]
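The inter-satellite comparison behind this slide amounts to differencing collocated time series so that the common geophysical signal cancels and any relative drift stands out. The sketch below uses entirely synthetic monthly wind means; the drift rate, noise levels, and seasonal cycle are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(120)                                     # 10 years of monthly means
signal = 7.5 + 0.5 * np.sin(2 * np.pi * months / 12)        # shared wind signal (m/s)
sat_a = signal + rng.normal(0, 0.05, months.size)           # stable sensor
sat_b = signal + 0.002 * months + rng.normal(0, 0.05, months.size)  # drifting sensor

# The common signal cancels in the difference; fit a line to expose the drift.
diff = sat_a - sat_b
drift_per_month, offset = np.polyfit(months, diff, 1)
print(f"relative drift: {drift_per_month * 12:+.3f} m/s per year")
```

A nonzero slope in the difference series flags a relative calibration problem, though it cannot by itself say which satellite is drifting; that is why the slide compares several sensors at once.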

  11. Data Exclusion Based on Yield
  Along with the error estimate, provide the cumulative distribution function.
  [Figure: 0.1% flagged → σ = 0.93; 1% flagged → σ = 0.73; 10% flagged → σ = 0.51]
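The yield trade-off on this slide can be sketched as follows: rank retrievals by their formal error estimate, discard the worst fraction, and report the error standard deviation of what remains. The data below are simulated (formal errors drawn uniformly, actual errors drawn with that spread), so the resulting σ values will not match the figure's numbers; only the shape of the trade-off is the point.

```python
import numpy as np

rng = np.random.default_rng(2)
formal_err = rng.uniform(0.2, 1.5, 100_000)      # per-retrieval formal error estimate
actual_err = rng.normal(0.0, formal_err)         # simulated actual retrieval errors

order = np.argsort(formal_err)                   # best-estimated retrievals first
for flagged in (0.10, 0.01, 0.001):              # fraction of data excluded
    keep = order[: int(len(order) * (1 - flagged))]
    print(f"{flagged:>6.1%} flagged -> sigma = {actual_err[keep].std():.2f}")
```

Flagging more data yields a smaller error standard deviation at the cost of coverage, which is why the slide argues for giving users the full cumulative distribution rather than a single quality cut.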

  12. Communicating Uncertainties in ESDRs: Summary
  • Providing information on the accuracy of products is essential
  • There is no single easy answer:
    • Uncertainties are more complex than the retrievals themselves
    • Quite dynamic; highly dependent on the environment
  • Multiple approaches are required:
    • Quality flags
    • Formal errors
    • Percentage yield
    • Simultaneous retrievals from multiple algorithms
    • For climate trends, compare results from different satellites
  • Data providers and users must work together to determine the best approach:
    • Pro-active: users must be encouraged to use uncertainty information
    • Moderated blog: users and providers share common issues and problems
    • Web-based support documentation on uncertainties
  • AMSR-E will serve as a test-bed
