
Validation Data Support Activities at the ORNL DAAC




Presentation Transcript


  1. Validation Data Support Activities at the ORNL DAAC Bob Cook, Steve Margle, and Lisa Olsen ORNL DAAC Jeff Morisette, NASA GSFC John Dwyer, EROS Data Center Faith Ann Heinsch, University of Montana Land Cover Validation Workshop Boston, Massachusetts February 2, 2004

  2. ORNL DAAC? Oak Ridge National Laboratory’s (ORNL) Distributed Active Archive Center (DAAC) www.daac.ornl.gov • Archives and distributes data for terrestrial biogeochemistry and ecosystem dynamics, including in situ data for validation

  3. Validation: “Assessing by independent means the uncertainties of data products” [Diagram: in situ observations and remote sensing of the LAI/fPAR and NPP/Land Cover products compared for validation] Compiling data to assist in validating MODIS land products • Good science requires understanding product accuracy / uncertainty • Explicit statements of uncertainty foster an informed user community and improved use of data • Use of global products produced by CEOS members requires characterization of each product’s uncertainty

  4. Data Support for Validation • Data needs and schedules defined by working with scientists funded to validate MODIS Products • Field investigators have responsibility to process, document, QA, and provide access to their data • ORNL DAAC provides the Mercury system for search and retrieval of both in situ and remote sensing data • MODIS ASCII Subsets of key products prepared in a format convenient for comparison with field data • When finalized, Land Validation data move to the long-term archive (ORNL DAAC)

  5. Mercury Metadata Search and Data Access System • Designed to provide access over the Internet to distributed research data • Low requirements for the validation investigator (data provider) • Software tool available to help produce metadata and documentation • Point-and-click download function • Currently used for a number of activities • EOS Validation, southern Africa (SAFARI 2000), and Amazonia (LBA)

  6. Land Validation Core Sites • 26 sites stratified by six biomes • Based on existing stations, resources, experts (FLUXNET, AERONET, BigFoot, LTER, etc.) • Long-term monitoring • Jointly nominated by Instrument and Validation PIs • Remote Sensing Data: Landsat 5, ETM+, IKONOS, ASTER, and airborne sensors, subsetted over the core sites • in situ Site Data: collected by various groups at core sites, available via Mercury • Networks: AERONET, FLUXNET, BSRN, etc. • Ancillary Site Data/GIS Layers: site variables, elevation, land cover, reference layers (e.g., political boundaries, airports, water bodies)

  7. Global FLUXNET Network for MODIS NPP Product Validation

  8. FLUXNET Activities • 252 towers registered from 11 networks • Gap-filled flux data available for 176 site-years from 53 sites • Database includes • Over 4,800 site ancillary values • Over 2,000 instrument descriptors • Over 5,000 literature citations

  9. MODIS ASCII Subsets: FY 2004 • Preparing and posting 9 Terra (C4) and 3 Aqua (C3) MODIS Products for 274 sites • Developing web-based data visualization tools

  10. MODIS ASCII Subsets (7 x 7 km) • MODAPS at GSFC (NASA) provides subsets of selected products in HDF and SIN projection for 274 sites • ORNL DAAC subsets to 7 x 7 km and reformats from HDF image into an ASCII file of pixel values and QA flags, expanding as data are collected • ASCII file posted on the DAAC’s ftp site [Examples: MOD12Q1 (C3) and MOD15A2 (C4) subsets for Walker Branch, Tennessee]
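As a rough illustration of how such an ASCII subset might be consumed, the sketch below parses a single data row into a 7 x 7 grid of pixel values. The column layout shown (date code, product name, then one value per pixel, comma-separated) and the sample values are assumptions for illustration; the actual ORNL DAAC file layout may differ.

```python
# Hypothetical one-row sample of a MODIS ASCII subset:
# date code, product, then 49 pixel values for the 7 x 7 km grid.
SAMPLE_ROW = "A2003185,MOD15A2," + ",".join(str(v) for v in range(49))

def parse_subset_row(row, grid_size=7):
    """Split one ASCII-subset row into (date_code, product, pixel grid)."""
    fields = row.split(",")
    date_code, product = fields[0], fields[1]
    values = [int(v) for v in fields[2:]]
    assert len(values) == grid_size * grid_size, "unexpected pixel count"
    # Reshape the flat pixel list into grid rows (row-major order assumed).
    grid = [values[i * grid_size:(i + 1) * grid_size] for i in range(grid_size)]
    return date_code, product, grid

date_code, product, grid = parse_subset_row(SAMPLE_ROW)
print(date_code, product, grid[3][3])  # grid[3][3] is the center (tower) pixel
```

With this layout, the center pixel of the 7 x 7 grid corresponds to the pixel containing the site (e.g., a flux tower), which is why the reshape step matters.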

  11. MODIS Land Cover Grids • 7- x 7-km grids of MODIS Land Cover (MOD12Q1) • Based on C3 processing for MODIS Land Cover (UMD Classification) • Data available for other classifications and their uncertainty [Example grid: BOREAS NSA 1963 burn site, Canada, with tower pixel marked; 1 = Evergreen Needleleaf Forest (ENF), 5 = Mixed Forest]
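A minimal sketch of working with such a land-cover grid: translating numeric class codes into labels, as in the example above. Only codes 1 and 5 are named on the slide; any other code is labeled generically here rather than guessing the rest of the UMD class table.

```python
# Partial UMD land-cover class table; only the two codes named on the
# slide are included, so unknown codes fall back to a generic label.
UMD_CLASSES = {
    1: "Evergreen Needleleaf Forest (ENF)",
    5: "Mixed Forest",
}

def label_grid(codes, classes=UMD_CLASSES):
    """Translate a grid of numeric land-cover codes into text labels."""
    return [[classes.get(c, f"class {c}") for c in row] for row in codes]

# Illustrative 3 x 3 excerpt (values assumed, not real site data).
labels = label_grid([[1, 1, 5], [1, 5, 5], [5, 5, 5]])
print(labels[0][0])  # → Evergreen Needleleaf Forest (ENF)
```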

  12. Demonstrations • WebMap Server • Enables users to search for and select sites that have MODIS ASCII Subsets • MODIS ASCII Subset Data Visualization • Work in progress, currently only available for LAI/fPAR (MOD15 C4) • Grid Visualization • Time Series

  13. MODIS Validation … What works? • Growing buy-in to the ‘core sites’ concept where the field, airborne, satellite data suite is helping to address scaling issues • Exploitation of in situ networks (e.g. flux towers, AERONET, LTER/ILTER, BSRN) and field investigations (e.g., LBA, SAFARI 2000) • Simplified data access with dedicated archives and ‘easy to use’ data format (e.g., ASCII subsets) • Frequent communication between Product PIs, Validation PIs, and data facilitators • Best validation sites are those where investigators taking ground measurements are also interested in satellite products

  14. MODIS Validation … What’s in the works? • Data for validation of MODIS Land products continue to be compiled and made available • The number of sites is growing through coordination with science networks and CEOS members • Improving formal documentation and making validation results more readily available to users • Users can provide important feedback over and above formal validation investigators (e.g., through DAAC surveys and user services)

  15. Validation … on-going needs? • Promote a stronger interface to the user community (e.g., through Validation Workshops) • Develop a stronger connection with the scientific community to establish how accurate the products need to be, and how to convey product uncertainty so that it can be incorporated into how the products are used

  16. Best Practices for Preparing Ecological and Ground-Based Data Sets to Share and Archive • Best Practices include: 1. Assign Descriptive File Names 2. Use Consistent and Stable File Formats 3. Define the Parameters 4. Use Consistent Data Organization 5. Perform Basic Quality Assurance 6. Assign Descriptive Data Set Titles 7. Provide Documentation • Provided this document to the SAFARI 2000, EOS Land Validation, BigFoot, and LBA projects • Available on-line, in booklets, and in Ecol. Bull. (2001) (http://www.daac.ornl.gov/DAAC/PI/bestprac.html)
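Practices 3 and 5 above (define the parameters; perform basic quality assurance) can be sketched in a few lines: check each value in a record against a declared valid range. The parameter names, units, and ranges below are hypothetical examples, not part of the Best Practices document itself.

```python
# Hypothetical parameter definitions: name -> (units, valid_min, valid_max).
# Declaring these explicitly is the spirit of "Define the Parameters".
PARAMETERS = {
    "air_temp": ("deg C", -50.0, 50.0),
    "lai": ("m2/m2", 0.0, 10.0),
}

def qa_check(record):
    """Return a list of QA problems (out-of-range values) in one record."""
    problems = []
    for name, value in record.items():
        units, lo, hi = PARAMETERS[name]
        if not (lo <= value <= hi):
            problems.append(f"{name}={value} outside [{lo}, {hi}] {units}")
    return problems

print(qa_check({"air_temp": 21.5, "lai": 12.3}))  # flags the LAI value
```

A check this simple catches many common data-entry and sensor errors before a data set is shared or archived.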

  17. in situ Data for EOS Core Sites

  18. Leaf Area Index (MOD15A2), Walker Branch, Tennessee [Image series: April 7, 2003; April 23, 2003; May 25, 2003; July 20, 2003; Sept. 14, 2003; Nov. 1, 2003]

  19. Leaf Area Index (MOD15A2), Walker Branch, Tennessee [Time-series plot: all pixels in the 5 x 5 km area included; blue = good pixels, green = pixels rejected based on clouds, red = mean of the 5 x 5 km area around the tower]
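The screening shown in that plot can be sketched as follows: average LAI over the subset area for one date, using only pixels whose QA flag marks them as good. The flag convention here (0 = good, nonzero = rejected) is a simplification for illustration; the real MOD15A2 QA field is a packed bit field.

```python
def mean_good_lai(values, qa_flags, good_flag=0):
    """Mean LAI over pixels whose QA flag equals good_flag; None if none pass."""
    good = [v for v, q in zip(values, qa_flags) if q == good_flag]
    if not good:
        return None  # all pixels rejected for this date (e.g., clouds)
    return sum(good) / len(good)

# Illustrative values for one date (assumed, not real Walker Branch data):
lai = [4.2, 4.0, 6.8, 4.1]   # 6.8 is a cloud-contaminated pixel
qa  = [0, 0, 1, 0]           # 1 = rejected based on clouds
print(mean_good_lai(lai, qa))
```

Repeating this per date yields the red mean curve in the plot, with cloud-rejected (green) pixels excluded from the average.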
