
Data Support for Validation of EOS Land Products



  1. Data Support for Validation of EOS Land Products Larry Voorhees, Dick Olson, Bob Cook, ORNL* DAAC Jeff Morisette, NASA GSFC John Dwyer, EDC DAAC Terra Data Products Review NASA HQ January 7-8, 2002 * Oak Ridge National Laboratory is managed by UT-Battelle, LLC, for the U.S. Dept. of Energy under contract DE-AC05-00OR22725.

  2. Contributors and Participants: Jeff Privette, Jeff Morisette, Nazmi El Saleous, and Robert Wolfe, NASA GSFC; John Dwyer, EDC DAAC; Steve Running, U Montana; Ranga Myneni, Boston U; Alfredo Huete, U Arizona; Dennis Baldocchi, UC Berkeley; Tom Boden, AmeriFlux, ORNL; and many other Science Team and EOS Validation Investigators

  3. Summary • Approach for supporting validation has been designed, reviewed, and tested • Distribution systems in place and working • Data for validation are available and more data are being added • MODIS Science Team, EOS Validation PIs, and others are using the systems • Plans are in place to communicate validation results via special journal issues and workshops • Need to extend correlative analysis to global network of sites

  4. Outline • Validation Approach • Data Available for Validation • high resolution imagery • in situ data • Intercomparison Activities • Data Use • Communication • Path Forward • What works & what needs to be improved?

  5. Validation: “Assessing by independent means the uncertainties of data products” [Diagram: LAI/fPAR and NPP estimated from in-situ observations compared against the same quantities from remote sensing] • MODIS Land Validation: Level 2 to 4 data products, primarily ecosystem variables • Successful validation depends on determining measurement uncertainties for specific uses • Validation of remote sensing land products at a global scale is an unprecedented scientific challenge
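
In quantitative terms (a minimal illustration; the slides do not prescribe specific metrics), the uncertainty of a product can be summarized by the bias and RMSE over N matched pairs, where \hat{x}_i is the remote sensing retrieval and x_i the co-located in situ value:

    \mathrm{bias} = \frac{1}{N}\sum_{i=1}^{N}\left(\hat{x}_i - x_i\right),
    \qquad
    \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\hat{x}_i - x_i\right)^{2}}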

  6. Approach: compare MODIS products with parameters derived from higher-resolution airborne or satellite sensors and with data from ground-based studies at validation ‘core sites’, flux towers, data from the literature, etc.

  7. Land Products Validation: Primary Products Supported [Table based on the MODIS Land Team Validation Update for Terra and Aqua, December 2000]

  8. EOS Land Validation Core Sites • Based on existing stations, resources, experts • FLUXNET, AERONET, BigFoot, LTER, etc. • Long-term monitoring • Jointly nominated by Instrument and Validation PIs • 26 sites stratified by six biomes • Promote resource sharing and synergies in validating multiple sensors and products • Imagery and in situ data available for many uses, not just validation

  9. Data for EOS Land Validation Core Sites • Remote Sensing Data: Landsat 5, ETM+, IKONOS, ASTER, and airborne sensors, subsetted over the core sites • in situ Site Data: collected by various groups at the core sites • Networks: AERONET, FLUXNET, etc. • Ancillary Site Data / GIS Layers: site variables, elevation, land cover, reference layers (e.g., political boundaries, airports, water bodies)

  10. Remote Sensing Data for Core Sites

  11. Validation Subsetting by MODAPS • EOS Core Sites (200 x 200 km) to EDC • EOS Core Sites and GT-NET sites (11 x 31 km) to UMt and ORNL • LAI Inter-comp. Sites (200 x 200 km) to EDC • SAFARI 2000 Sites (200 x 200 km) to UMd and South Africa Disaster Management Center • Provided in HDF format, integerized sinusoidal (ISIN) projection
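
As an illustration of handling these files, a subset can be read with the pyhdf library; this is a minimal sketch, and the file name and "Lai_1km" dataset name are assumptions, not actual MODAPS output names.

    # Read one science dataset from an HDF4 subset file with pyhdf.
    # The file name and the "Lai_1km" SDS name are assumptions for
    # illustration, not actual MODAPS output names.
    from pyhdf.SD import SD, SDC

    f = SD("core_site_subset.hdf", SDC.READ)
    print(list(f.datasets()))              # names of available datasets
    sds = f.select("Lai_1km")              # pick one layer, e.g. 1-km LAI
    data = sds.get()                       # numpy array of stored values
    attrs = sds.attributes()               # scale_factor, _FillValue, etc.
    values = data * attrs.get("scale_factor", 1.0)
    f.end()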

  12. Posting MODIS Products at ORNL DAAC • MODIS Products: Veg Index, fPAR / LAI, PSN / NPP, BRDF / Albedo, Surface Reflectance, Surface Temperature, LC/LCC • Subset to 7x7 km and re-projected from MODIS sinusoidal to UTM to relate to field work • ORNL worked with EDC to refine the MODIS Reprojection Tool • Reformatted into ASCII • Provide a single expanding table for each site-product: rows represent dates, columns hold values for the 49 pixels, QA flags included
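
A minimal sketch of reading one such site-product table, assuming each row carries the composite date, the 49 pixel values, and then the per-pixel QA flags; the DAAC's actual column order may differ.

    # Parse one site-product ASCII table: each row holds a composite date,
    # 49 pixel values (the 7x7 cutout), then per-pixel QA flags.
    # The exact column layout is an assumption for this sketch.
    import csv

    dates, grids, qa = [], [], []
    with open("park_falls_lai.txt") as f:       # hypothetical file name
        for row in csv.reader(f):
            dates.append(row[0])                             # date
            grids.append([float(v) for v in row[1:50]])      # 49 pixels
            qa.append(row[50:])                              # QA flags

    center_pixel = [g[24] for g in grids]       # index 24 = center of 7x7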

  13. MODIS 8-day LAI Product, Park Falls forest site, 08 Nov 2000 – 16 Oct 2001 [Time-series plot: LAI of the center pixel and of all pixels in the 5x5 km cutout; blue = accepted, green = rejected based on QA flags; gaps where data not available] The scatter of blue (3x3 km) and green (5x5 km) dots shows variability in LAI around the flux tower.
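
A sketch of the accept/reject screening behind this figure, assuming the common MOD15A2 convention that bit 0 of the FparLai_QC byte is 0 for main-algorithm retrievals (treated here as "accepted"); the cutout values are synthetic.

    # Accept/reject pixels in a 5x5 LAI cutout using the QA byte.
    # Assumes the MOD15A2 convention: bit 0 of FparLai_QC == 0 means the
    # main algorithm produced the retrieval; data below are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    lai = rng.uniform(0.5, 6.0, size=(5, 5))              # scaled LAI values
    qc = rng.integers(0, 4, size=(5, 5), dtype=np.uint8)  # QA bytes

    accepted = (qc & 0b1) == 0                    # bit 0 clear -> accept
    mean_lai = lai[accepted].mean() if accepted.any() else float("nan")
    print(f"{accepted.sum()} of 25 pixels accepted, mean LAI {mean_lai:.2f}")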

  14. ORNL DAAC in situ Data • Data needs and schedules defined by working with MODLAND Validation scientists • Investigators have responsibility to process, document, QA, and provide access to their data • Provide Mercury system for search and retrieval, for both in situ and remote sensing data • Eventually move Land Validation data to long-term archive (ORNL DAAC) • Also provide Regional and Global data sets for modeling as part of ORNL DAAC mission

  15. in situ Data for Core Sites

  16. Site Characteristics Data Most core validation sites belong to several networks that provide ancillary information • ORNL is compiling and providing site characteristics data for the Core Sites: NPP, LAI, fPAR, phenology, micrometeorology, climatology, soils, vegetation cover, etc. • Data can be used for validating other sensors or for global change research • Assemble metadata documenting methods, processing, uncertainty, and data source • Examine relationships between parameters for QA checks (an illustrative check follows)
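
As one concrete, hypothetical example of a cross-parameter QA check: reported fPAR and LAI should be roughly consistent under a simple Beer-Lambert canopy model, fPAR ≈ 1 − exp(−k·LAI). The extinction coefficient k and the tolerance below are assumed values for this sketch, not ORNL's actual rules.

    # Illustrative cross-parameter QA check: flag records whose reported
    # fPAR is far from the value implied by their LAI under Beer-Lambert.
    # k and tol are assumptions for this sketch.
    import math

    def fpar_lai_consistent(fpar, lai, k=0.5, tol=0.2):
        expected = 1.0 - math.exp(-k * lai)
        return abs(fpar - expected) <= tol

    print(fpar_lai_consistent(0.86, 4.0))   # expected ~0.86 -> True
    print(fpar_lai_consistent(0.30, 4.0))   # large mismatch -> False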

  17. FLUXNET 157 Towers

  18. ORNL DAAC Validation Data • Historical data from literature: Net Primary Productivity (1,000 site estimates at points; 7,000 0.5º grid cells) and Leaf Area Index (1,000 estimates) • Regional and Global data sets in the ORNL DAAC archive: Climate (7), Vegetation (~100), Soils (9), Hydrology (3) • 119 data sets registered in Mercury covering vegetation, land use, soil, climate, hydrology, etc.

  19. BigFoot: Linking in situ Measurements, Remote Sensing, and Models • Using ground measurements, remote sensing, and ecosystem models • Measuring land cover, LAI, fPAR, and NPP • Scaling from field measurements to a 5 x 5 km grid

  20. SAFARI 2000 • Coordinated field campaign in southern Africa with links to MODIS Land and MODIS Atmosphere validation • Intensive campaigns in 1999 - 2001 to examine land-atmosphere interactions • ORNL DAAC provides Mercury for access to: • in situ data collected at field sites • airborne sensors (MAS, AirMISR) • air chemistry measurements

  21. GT-Net NPP Project • Coordinated by GTOS with assistance from ILTER • Use hierarchical modeling approach that combines satellite data and in situ observations • Sites agreeing to measure NPP in exchange for MODIS products for their site • 25 sites in demo phase

  22. LAI/FPAR Network • Joint activity of CEOS LPV and MODLAND • 31 sites participating worldwide • MODLAND coordinating Landsat ETM+ acquisitions and MODIS LAI subsets • 12-18 month pathfinder • Leverage existing field campaigns • Workshops: 1998 to 2001 • Proposed MODIS LAI to field LAI inter-comparison

  23. MODIS - Model - Data Intercomparison • To validate Level 4 Land Products, daily photosynthesis and annual NPP • Validating results of models embedded in NPP algorithm against other models • Activities: • Real-time Validation • Ecosystem Model Data Intercomparison (EMDI) • LAI-Net • GT-NET activities

  24. Individual Site Data: Flux-Model-MODIS Comparison (17 sites) • CDIAC/AmeriFlux: model initialization parameters, weekly micromet data, initial 0.5-hr flux data • Modeling Groups: 5 groups, more coming; results posted on AmeriFlux • MODAPS: 52 sites, 8 products, 11 km x 31 km subsets, ISIN projection, .hdf • DAAC/FLUXNET: MODIS products (7x7 cutouts), FLUXNET gap-filled data, links to field data via Mercury • Comparison Workshop: April 2002 (tentative) • Additional validation data
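
A minimal sketch of the DAAC/FLUXNET pairing step: aligning a flux-tower record with 8-day MODIS composites by date for one site. The file names and the modis_psn / tower_gpp columns are hypothetical stand-ins.

    # Pair a MODIS 8-day product subset with gap-filled flux data by date.
    # File and column names are assumed for illustration only.
    import pandas as pd

    modis = pd.read_csv("site_modis_psn.csv", parse_dates=["date"])
    flux = pd.read_csv("site_flux_gapfilled.csv", parse_dates=["date"])

    # Average the denser flux record into 8-day windows, then join on the
    # window start date so each MODIS composite has one matching flux value.
    flux_8day = (flux.set_index("date")
                     .resample("8D")
                     .mean(numeric_only=True)
                     .reset_index())
    paired = pd.merge(modis, flux_8day, on="date", how="inner")
    print(paired[["date", "modis_psn", "tower_gpp"]].head())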

  25. User Statistics for Validation Data • Use of Mercury • Referrals to registered data and documentation • Data downloads of remote sensing images of core sites

  26. Validation Investigators Whose Data are Registered in Mercury (number of data sets indicated if >1): Aber, John; Ahl, Douglas; Augustine, John A.; Baldocchi, Dennis; Bruegge, Dr. Carol J.; Burrows, Mr. Sean (5); Cohen, Warren B. (4); Conel, Dr. James E.; Feldkirchner, Drew C. (3); Gaitley, Barbara J.; Gower, Dr. Stith T. (8); Gregory, Matthew J.; Honda, Dr. Yoshiaki; Huete, Dr. Alfredo R. (6); Jenkins, Julian; Kajiwara, Dr. Koji; Kirschbaum, Alan A. (3); Knjazihhin, Yuri; Liang, Dr. Shunlin; Mackay, Dr. David Scott; Maiersperger, Thomas K. (4); Meyer, Dave (4); Moran, Susan; Morisette, Dr. Jeffrey T. (11); Myers, Daryl R.; Myneni, Ranga B.; Olson, Richard J. (2); Privette, Dr. Jeffrey L. (4); Qin, Wenhan; Robbins, Cullen (4); Thome, Dr. Kurtis J.; Townshend, Dr. John R.G.; Turner, David P.; Walter-Shea, Elizabeth A.; Wylie, Bruce (3); al-Abbadi, Dr. Naif

  27. Validation Data through Mercury • 57 validation data sets registered in Mercury • Land Validation – 49 data sets • SAFARI 2000 – 5 data sets from NRA selectees • ORNL DAAC Regional and Global Data – 3 • User statistics, July – December 2001: 1,386 searches, 333 users [Chart: Mercury searches over time]

  28. Types of Referrals July 1 – November 30, 2001 (388 referrals, 49 unique users)

  29. Most Referred Data Sets July 1 - December 31, 2001 (388 referrals, 49 unique users)

  30. Products downloaded from EDC by Core Site since Oct 99: MODIS files – 385; ETM+ scenes – 95; total volume – 30 GB. Source: MODIS Science Team Meeting, Dec 01

  31. Communications • Guidance for data providers • Publishing validation results • ‘Popular articles’ for the general user community

  32. Guidance for Data Providers • Archiving Ecological Data and Information. Olson and McCord. Chapter VII in Michener and Brunt (eds.) 2000. Ecological Data. Blackwell Science. • Managing Data from Multiple Disciplines, Scales, and Sites to Support Synthesis and Modeling. Olson et al. 1999. RSE. • BigFoot Field Manual. Campbell et al. 1999. ORNL TM. • Guidelines for Submitting Tabular Data. Cook et al. 2001. ESA Bulletin. • Mercury help and tutorial Web pages

  33. Publishing Future Validation Results • Special section in IEEE TGARS • Spring/Summer 2002 • requirement that all related data be registered and available via Mercury • EMDI Workshop, April 2002 • SAFARI 2000 Meeting, October 2002 • Flux/Model/Data Intercomparison Workshop ? • Other venues?

  34. Earth Observatory and NASA News articles feature: • MODLAND validation – by Rachel Hauser, September 19, 2001, http://earthobservatory.nasa.gov/Study/MODISCalibration • FLUXNET database – by Rachel Hauser, September 24, 2001, http://earthobservatory.nasa.gov/Study/Fluxnet/ • “New NASA Field Campaign Sees the Forest for the Satellite” – NASA News, August 2, 2001

  35. Path Forward

  36. Data resources for on-going validation (Terra, Aqua, NPP, etc.) • Make use of extant data • EOS Land Validation Core Sites should be maintained, although EOS Investigations have either expired or will expire in early 2002 • Without formal “validation investigators” we need stronger connections with users who can provide feedback • FLUXNET and BigFoot have continued funding • More collaboration with the CEOS Land Product Validation Subgroup

  37. Land Product Validation Subgroup • Increase the quality and economy of global satellite product validation • Develop and promote international standards and protocols for field sampling, scaling, error budgeting, data exchange, and product evaluation • Advocate mission-long validation programs for current and future earth observing satellites • Pilot projects: LAI, fire / burn scars, land cover • Using the EOS Land Validation Core Sites as an example to establish the ‘CEOS Land Validation Core Sites’

  38. Validation . . . what works • Growing buy-in to the ‘core sites’ concept • Exploitation of in situ networks (e.g. flux towers, AERONET) and field investigations (e.g., LBA, SAFARI 2000) • International partnerships and coordination • Simplified data access with dedicated archives • Frequent communication between Product PIs, Validation PIs, and data facilitators

  39. Validation . . . on-going concerns • 26 core sites and ~150 tower sites are a relatively small sample, but supporting them still takes a lot of effort • Spatial differences / scaling - points vs. cells, and the need for a wide variety of ecosystem sites • Temporal differences - satellite vs. field data • Data inconsistencies - methods, variables, formats • More data / better knowledge = greater cost • How much is enough? • Global representation - how many points in time, space, and biome?

  40. Validation . . . on-going concerns (cont'd) • Control expectations • It takes a couple of years to validate a 1-yr Level 4 product • What's the end point for validation? . . . will we continually refine? • Stronger interface to the user community and on-going collaboration with the international community • Improve formal documentation and availability of results • Consider establishing a ‘permanent’ NASA Office on Calibration and Validation

  41. Comments . . . . Questions?
