
Satellite Vegetation Phenology Products Review Meeting

This meeting aims to review and provide feedback on the satellite vegetation phenology and vegetation index products derived from multiple long-term satellite data records. The discussion will cover data and methods, product characterization, errors and uncertainty, distribution, user support, and potential applications. The goal is to improve the quality and usefulness of these Earth System Data Records.





Presentation Transcript


  1. Science Review Panel Meeting, Biosphere 2, Tucson, AZ, January 4-5, 2011. Vegetation Phenology and Vegetation Index Products from Multiple Long Term Satellite Data Records: Meeting Goals and Charge. Kamel Didan (UA), Tomoaki Miura (UH), Mark Friedl (BU), Xiaoyang Zhang (NOAA), Jeff Czapla-Myers (UA), Willem van Leeuwen (UA), Calli Jenkerson (LP-DAAC), David Meyer (LP-DAAC). NASA MEASURES #NNX08AT05A

  2. For your reference
  • Reimbursement: only for invited panelists. Original receipts (leave them with us or send them by mail) to: Dr. Kamel Didan, ECE Department, Bldg #104, 1230 E. Speedway Blvd., Tucson, AZ 85721
  • Lodging and food
  • Restrooms
  • Internet and printing support
  • Around Biosphere 2
  • Contact people:
  • Val Kelly, Biosphere 2 Program Coordinator, vkelly@email.arizona.edu, office 520.838.6154 or cell 520.360.6431
  • Kamel Didan, Project lead, didan@email.arizona.edu, office 520.621.8514 or cell 520.440.9939
  MEASURES VIP ESDRs Science Review Panel

  3. PROJECT SUMMARY

  4. Data and methods
  • AVHRR record, '81-'99
  • MODIS record, '00-'13
  • SPOT-VGT data ('98-'02) to bridge the AVHRR and MODIS records
  • Data retention filter: only retain high-quality data
  • VIIRS?
  • Homogeneous clusters map: areas with the same characteristics
  • Continuity transfer function map
  • 1981-2013 sensor-independent surface reflectance record
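The continuity step above maps the AVHRR-era record onto a MODIS-like scale using transfer functions estimated per homogeneous cluster from the SPOT-VGT overlap period. A minimal sketch of that idea, assuming a simple per-cluster linear fit (the function names and the linear form are illustrative assumptions, not the project's actual continuity algorithm):

```python
import numpy as np

def fit_transfer_function(avhrr_vi, modis_vi):
    """Fit a linear mapping from AVHRR-scale VI to MODIS-scale VI
    for one homogeneous cluster, using coincident observations
    from the sensor-overlap (bridge) period."""
    slope, intercept = np.polyfit(avhrr_vi, modis_vi, 1)
    return slope, intercept

def apply_transfer_function(avhrr_vi, slope, intercept):
    """Translate AVHRR-era values onto the MODIS-like scale,
    yielding a sensor-independent record for that cluster."""
    return slope * np.asarray(avhrr_vi) + intercept
```

Fitting one function per cluster (rather than one global function) lets the adjustment vary with land-cover characteristics, which is the stated purpose of the homogeneous clusters map.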

  5. ESDR Products and objectives
  • Support the Earth Science research community by providing long-term, multi-sensor, reliable, and consistent Earth System Data Records (ESDRs) of:
  • Sensor-independent EVI (2-band version) and NDVI
  • A set of remote-sensing-based "Land Surface" Phenology parameters (season start, length, etc.)
  • Product characterization: errors/uncertainty in EVI/NDVI and in the phenology metrics
  • These ESDRs are expected to be well characterized, highly consistent, and of known error, to support the study of climate change and its impacts on the Earth system.
  • First generation (Beta) product suite to be released late 2010 / early 2011: http://measures.arizona.edu and distribution site: http://measuresvip.cr.usgs.gov/test/index.php
  MEASURES VIP ESDRs Science Review Panel
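Both indices above are computed from red and near-infrared surface reflectance. The EVI2 form below is the published two-band EVI (Jiang et al., 2008) that the "2-band version" refers to; the season-start function is just one common threshold-based sketch and is not necessarily the project's phenology algorithm:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi2(nir, red):
    """Two-band Enhanced Vegetation Index (Jiang et al., 2008),
    a blue-band-free approximation of EVI."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

def season_start(doy, vi, frac=0.5):
    """Illustrative season-start metric: first day-of-year on which
    the VI time series rises above a fixed fraction of its seasonal
    amplitude. One common threshold-based definition among several."""
    lo, hi = min(vi), max(vi)
    threshold = lo + frac * (hi - lo)
    for d, v in zip(doy, vi):
        if v >= threshold:
            return d
    return None
```

Because EVI2 needs only the red and NIR bands, it can be computed consistently across AVHRR, MODIS, and SPOT-VGT, which is what makes a sensor-independent EVI record feasible.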

  6. Science review panel

  7. Project Review - Format
  • A day-plus-long meeting organized along the project's goals and activities
  • Our team will present the following topics:
  • EVI2 algorithm
  • Across-sensor continuity algorithm
  • Phenology algorithm
  • ESDR product format, error and uncertainty, distribution, user support, and long-term archives
  • Potential applications
  • The science review panel is expected to provide feedback on and critique of:
  • The science algorithms
  • Product format, specification, characterization, and the error-and-uncertainty approach
  • The distribution and user support system
  • Application and user community needs
  • Miscellaneous
  • Where to improve

  8. Science Review Panel - Participants • The review panel was selected based on expertise, project needs, and availability

  9. Review - Goals & Expectations
  • The project team expects the following:
  • Critique of, and feedback on, our choices of algorithms, plans, approaches, and long-term goals
  • Where should we improve? We are particularly seeking feedback on:
  • Science algorithms, based on your own experience, similar and previous efforts, and community expectations
  • Product specifications and characterization. This has always been a serious problem with remote sensing data, so we are very open to feedback on projection, contents, data format, parameter specification, etc.
  • Error and uncertainty, and how to convey product usefulness. This is a fairly new focus for remote sensing measurements that should prompt a lot of discussion:
  • How to approach validation; what is validation (for data from the 1980s); when and how is a product considered validated?
  • How does this feed back into the production/algorithms?
  • The concept of consistency versus certainty/uncertainty, as opposed to absolute validation
  • How to measure success
  • etc.
  • Data formats, distribution mechanism, user support

  10. Review - Follow-up mechanism
  • We expect the panel to provide us with a summary document of their findings, suggestions, comments, and expectations. We have assigned your names to topics, and we can adjust these during the meeting.
  • The team will provide the panel with a plan on how to accommodate and implement the recommendations of the review.
  • We will keep the panel informed of our progress and solicit further feedback (via the project web site).

  11. Review - NASA Expectations
  • These are reworded expectations developed during discussions with the NASA project managers (Diane Wickland and Martha Maiden):
  • Algorithms and approaches must reflect community consensus; the project team is expected to strive to implement what the "majority" sees as best
  • The ESDRs must reflect the community's wishes so that they get used effectively
  • Balance the desire for no distribution delays against potential early product issues, caveats, warnings, etc.:
  • When should the product be released?
  • What metrics should we consider as indicators of project success?
  • No room for research and validation
  • The project needs to transition to a future steady state beyond the project life:
  • How to extend the time series (with new data)
  • How to continue supporting the product suite
  • Core production

  12. Additional information

  13. Total of three reviews
  • There are plans for two more rounds of reviews in addition to this one:
  • ATBD review, focused on the science algorithms
  • UWG-style review (April 2011) during the ORNL/LP-DAAC UWG Meeting, focused on the products, distribution, and user support

  14. Thank you for taking part in this
