

  1. NPOESS Preparatory Project (NPP) Science Data Segment (SDS) Critical Design Review (CDR): Atmosphere PEATE. October 31 - November 1, 2007, NASA/GSFC Building 16W, Room N76/N80

  2. NPP Atmosphere Team Goals for Atmosphere PEATE • The primary purpose of the Atmosphere PEATE is the evaluation of official NGST cloud and aerosol (i.e., suspended matter) products. • Note that NGST cloud/aerosol products have not yet been produced globally for evaluation. • Path chosen by NPP Science Team members for application at the Atmosphere PEATE: • a. Use MODIS/AIRS as proxies for VIIRS/CrIS • b. Provide capability to intercompare cloud/aerosol products with active sensors (CALIPSO and CloudSat) • c. Identify areas where agreement is poor for further evaluation • d. Modify algorithms and test on global data over multiple months • In this fashion, make continual progress toward global evaluation and improvement of historical algorithms (such as for AVHRR), operational MODIS products, and NGST approaches.

  3. Considerations for Accurate Cloud and Aerosol EDR/CDRs • Attention to detail: • a. orbital drift • b. inter-satellite calibration • c. well-mixed gases changing concentration over time • d. spectral shifts between successive instruments for channels of the “same” nominal wavelength • e. improvements in forward RT models • f. understanding ancillary data (e.g., global 1-km albedo and emissivity maps now available) • g. sampling (treatment of data gaps) • …all of which lead to assessment of error characteristics

  4. Evaluation of Aerosol/Cloud Properties Using A-Train Sensors • Cloud mask (CALIPSO; currently available) • Cloud layering (CALIPSO/CloudSat; currently available) • Cloud top height (CALIPSO/CloudSat; currently available) • Cloud base (CALIPSO for optically thin clouds, CloudSat for thicker clouds) • Cloud thermodynamic phase (CALIPSO product imminent) • Cloud and aerosol optical thickness τ (needs extinction profile product from CALIPSO: from CALIPSO for τ < ~3, from CloudSat for higher τ) • Cloud particle radius (ongoing research with both CALIPSO & CloudSat) Subject to approval: • Aerosol/cloud discrimination (CALIPSO product imminent) • Aerosol layer height (CALIPSO product imminent)

  5. [Diagram: PEATE processing cycle: Re-process satellite sensor data → Modify/test multiple algorithms → Evaluate granule results → Evaluate global results]

  6. Intercomparison of CALIPSO CALIOP and Aqua Data. CALIOP data: 80 m resolution; MODIS cloud products at both 1 & 5 km. The process goes like this: 1. Determine the mechanics of how to link observations from two different spaceborne platforms (i.e., Aqua and CALIPSO). 2. Link viewing geometry to obtain correspondence between observations. 3. Strip out the appropriate data products (may mean multiple granules). 4. Perform the intercomparison* (a collocation sketch follows below). * Assumes understanding of the data, retrieval algorithms, and products. [Diagram: A-Train constellation: CloudSat, Aqua, CALIPSO, PARASOL, Aura; ~75 seconds separation]
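To make step 2 concrete, here is a minimal collocation sketch in Python, assuming flattened latitude/longitude/time arrays for a MODIS granule and a CALIOP track. The function name, thresholds, and array layout are illustrative assumptions, not the PEATE's actual collocation code.

```python
import numpy as np

def collocate(modis_lat, modis_lon, modis_time,
              caliop_lat, caliop_lon, caliop_time,
              max_dist_km=5.0, max_dt_s=300.0):
    """For each CALIOP shot, find the nearest MODIS pixel within the
    distance and time thresholds. All inputs are 1-D numpy arrays
    (MODIS fields flattened from the granule); returns an index array
    with -1 where no acceptable match exists."""
    earth_radius_km = 6371.0
    matches = np.full(caliop_lat.shape, -1, dtype=int)
    for i, (clat, clon, ct) in enumerate(zip(caliop_lat, caliop_lon, caliop_time)):
        # Haversine great-circle distance from this shot to every MODIS pixel
        dlat = np.radians(modis_lat - clat)
        dlon = np.radians(modis_lon - clon)
        a = (np.sin(dlat / 2.0) ** 2
             + np.cos(np.radians(clat)) * np.cos(np.radians(modis_lat))
             * np.sin(dlon / 2.0) ** 2)
        dist_km = 2.0 * earth_radius_km * np.arcsin(np.sqrt(a))
        ok = (dist_km <= max_dist_km) & (np.abs(modis_time - ct) <= max_dt_s)
        if ok.any():
            # Index (into the flattened MODIS arrays) of the closest match
            matches[i] = np.flatnonzero(ok)[np.argmin(dist_km[ok])]
    return matches
```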

  7. NPP Science Team Counterparts for Atmosphere PEATE Bryan Baum (UW): VIIRS Cloud Retrievals Christina Hsu (GSFC): VIIRS Aerosol Retrievals Hank Revercomb (SSEC): SDR Validation Omar Torres (UMBC): Aerosol Validation Paul Menzel (UW): VIIRS and heritage CDRs

  8. Atmosphere PEATE Organization. Project Management: Hank Revercomb (PI), Liam Gumley (Co-I, PM). Algorithms & Validation: Bob Holz, Richard Frey, Bryan Baum, Paolo Antonelli, Andy Heidinger, Mike Pavolonis, Dave Tobin. Computing Systems: Scott Mindock, Steve Dutcher, Bruce Flynn, Rick Jensen. Operations: Jerry Robaidek, Rosie Spangler, Dee Wade.

  9. NPP Environmental Data Records (EDRs). Land PEATE: 1. Albedo (Surface); 2. Land Surface Temperature; 3. Snow Cover and Depth; 4. Surface Type; 5. Active Fires (ARP); 6. Ice Surface Temperature; 7. Vegetation Index; 8. Aerosol Optical Thickness; 9. Aerosol Particle Size. Ocean PEATE: 10. Ocean Color/Chlorophyll; 11. Sea Surface Temperature. Ozone PEATE: 12. Ozone Total Column/Profile. Atmosphere PEATE: 13. Suspended Matter; 14. Cloud Cover/Layers; 15. Cloud Effective Particle Size; 16. Cloud Top Height; 17. Cloud Top Pressure; 18. Cloud Top Temperature; 19. Cloud Base Height; 20. Cloud Optical Thickness. Sounder PEATE: 21. Atm Vertical Moisture Profile; 22. Atm Vertical Temp. Profile; 23. Atm Vertical Pressure Profile. VIIRS Intermediate Products: Cloud Mask.

  10. Atmosphere PEATE Facilities. The NPP Atmosphere PEATE is implemented within the framework and facilities of the Space Science and Engineering Center (SSEC) at the University of Wisconsin-Madison. SSEC has successfully supported operational, satellite-based remote-sensing missions since 1967, and its capabilities continue to evolve and expand to meet the demands and challenges of future missions. • SSEC employs > 200 scientists, engineers, programmers, administrators, and IT support staff. • SSEC projects currently support GEO: GOES 10/11/12/R; Meteosat 7/9; MTSAT-1R; FY-2C/2D; Kalpana. LEO: NOAA 15/16/17/18, Terra, Aqua, NPP, NPOESS.

  11. Atmosphere PEATE Overview • Provide environment for pre-launch testing and evaluation of operational atmosphere EDR algorithms • Allow rapid post-launch evaluation and comparison of NPP atmosphere EDRs • Create infrastructure using available validation data to allow rapid assessment of NPP EDR products • Assist NPP Science Team in assessing the suitability of NPP atmosphere EDRs for continuing the climate record of cloud observations from space • Provide environment where NPP Science Team can test alternative EDR algorithms on climatologically significant samples of global proxy data • Assist NPP Science Team in providing improved or alternative EDR algorithms to the IDPS

  12. Changes since SDS PDR Completed System Requirements Review Completed Preliminary Design Review Deployed and tested Atmosphere PEATE Science Processing System Ingested Aqua MODIS global Level 1A data since Jan 2006 Processed one month of global Aqua MODIS proxy data using MODIS, MODIS VIIRS-like, and VIIRS OPS cloud mask algorithms Adapted existing software (LEOCAT) to create infrastructure for running VIIRS OPS EDR code on Linux Created EDR validation plan and demonstrated a validation approach for cloud mask Demonstrated process for evaluating a VIIRS Atmosphere EDR (Cloud Mask) using collocated CALIPSO lidar

  13. Atmosphere PEATE Interface Diagram. [Diagram: the Atmosphere PEATE at center, connected to SD3E, CLASS (ADS), CasaNOSA, Ancillary Data Providers, I&TSE, NICSE, PSOE, the NPP Atmosphere Science Team, the CALIPSO Science Team, and the Atmosphere Science Community. Labeled flows include: xDRs, IPs, and ancillary data (from SD3E, or from CLASS if unavailable from SD3E); ancillary data; pre-flight algorithms, data, and info; software and data; management direction; calibration updates and evaluations; xDR evaluation results and algorithm updates; analysis results and proposed algorithm updates; algorithm updates, test requests, and results; collocations; derived products. Directions are spelled out on the next slide.]

  14. SD3E: receive real-time VIIRS RDRs, SDRs, EDRs CLASS: receive archived VIIRS RDRs, SDRs, EDRs Ancillary Data Providers: receive ancillary data required by VIIRS EDRs NICSE: receive information on VIIRS instrument performance I&TSE: receive operational algorithm updates; send improved algorithms PSOE: send EDR evaluation reports, proposed algorithm updates CasaNOSA: receive pre-launch algorithm source code and test data NPP Atmosphere Team: receive guidance on EDR evaluation strategy; send EDR evaluation results Atmosphere Science Community: receive improved and alternative algorithms CALIPSO Science Team: receive CALIOP cloud products; send VIIRS EDR / CALIOP comparison results Atmosphere PEATE Interface Summary

  15. Interfaces: SD3E, CLASS, Ancillary Data Providers. Messaging: Any request or report requiring email interaction will be handled by SSEC Data Center (DC) staff (dcstaff@ssec.wisc.edu). Problem reports, system status notices, subscription requests, and transfer errors all fall into this category. DC staff are available 0730 - 2300 Central, Mon-Fri. DC will escalate issues to the PEATE team only if DC cannot solve them. File Transfers: Pull transfers: DC will initiate and monitor any regularly scheduled downloads from the SD3E 32-day data store, CLASS archive, or ancillary provider site. These downloads will be handled by the PEATE Ingest subsystem. Push transfers: DC will request and monitor any subscriptions established with the SD3E or CLASS. Checksums and digital signatures generated at SD3E, CLASS, or the ancillary provider will be automatically ingested and verified. Requests for retransmits will be routed through DC staff. The PEATE Ingest subsystem will also generate in-house checksums based on MD5. SD3E, CLASS, and ancillary provider file name conventions will be maintained. Files will be compressed internally using bzip2 (a minimal ingest-verification sketch follows below).
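As an illustration of the checksum and compression handling described above, here is a minimal ingest-verification sketch in Python. The function name and arguments are hypothetical, and the real Ingest subsystem's interface and error routing may differ.

```python
import bz2
import hashlib
from pathlib import Path

def verify_and_store(src: Path, expected_md5: str, store_dir: Path) -> Path:
    """Verify a transferred file against the provider's MD5 checksum,
    then bzip2-compress it into the local store, keeping the provider's
    file name convention."""
    data = src.read_bytes()
    digest = hashlib.md5(data).hexdigest()
    if digest != expected_md5:
        # In the real system a mismatch would become a retransmit request
        # routed through DC staff, not an in-process exception.
        raise IOError(f"checksum mismatch for {src.name}: got {digest}")
    dest = store_dir / (src.name + ".bz2")
    dest.write_bytes(bz2.compress(data))
    return dest
```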

  16. Network Performance: NASA GSFC to SSEC (2007). Demonstrated throughput from GSFC DAAC is > 50 Mbps; current average data flow is 15 Mbps.

  17. Data Volumes to Atmosphere PEATE (GB/day) • RDR: VIIRS: 276 • SDR: VIIRS M-bands (10%): 26; VIIRS geolocation, moderate resolution (10%): 12 • EDR: Clouds and Suspended Matter: 15 • RIP (all required for VIIRS EDRs, 5%): 50 • Total = 379 GB/day ≈ 35 Mbps (well within demonstrated capability)
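The GB/day-to-Mbps conversion can be checked with a few lines of Python (decimal units assumed):

```python
volumes_gb_per_day = {
    "RDR VIIRS": 276,
    "SDR VIIRS M-bands (10%)": 26,
    "VIIRS geolocation, moderate res. (10%)": 12,
    "EDR Clouds and Suspended Matter": 15,
    "RIP (all required for VIIRS EDRs, 5%)": 50,
}
total_gb = sum(volumes_gb_per_day.values())   # 379 GB/day
mbps = total_gb * 8 * 1000 / 86400            # GB -> Mbit, spread over 86400 s
print(total_gb, round(mbps, 1))               # 379 GB/day is about 35.1 Mbps
```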

  18. Interfaces: NICSE, I&TSE, PSOE, CasaNOSA NICSE is the primary source for pre-launch VIIRS characterization data and will be the focal point for information on in-orbit instrument performance. Notifications of LUT changes and LUT files will be obtained from NICSE as they become available. I&TSE will be the primary site for post-launch updates of operational RDR, SDR, and EDR algorithms, documentation, and test data. PSOE will be the primary recipient of SDR and EDR evaluation reports prepared by the Atmosphere PEATE in conjunction with the NPP Science Team. CasaNOSA is the primary site for pre-launch operational algorithm drops, chain test data, and documentation.

  19. Interfaces: NPP Science Team and Atmosphere Community Product Evaluation and Algorithm Testing/Improvement The Atmosphere PEATE will provide a development server for use by the NPP Science Team Atmosphere subgroup for interactive product evaluation (e.g., Matlab, IDL, ENVI) and testing of improved EDR algorithms (C, C++, FORTRAN). The system will also provide source code version management (SVN) and a modest online disk archive (<10 TB) for algorithm testing. Atmosphere Community members will interact with the PEATE via the NPP Science Team. EDR Product Generation The Atmosphere PEATE Science Processing System is currently envisioned as a “super-user” system. This means NPP Science Team members will deliver compiled EDR code to the system, where a system manager will run the code on the dataset requested by the investigator. Products will be made available on a PEATE FTP site. Data Search and Order The Atmosphere PEATE will provide a web-based interface for the NPP Science Team to search current data holdings and order files for FTP push or pull. Products not available online will be recreated as necessary. Products will be made available to the Atmosphere Community as deemed necessary by the NPP Science Team.

  20. Atmosphere PEATE Science Processing System • Assess cloud EDRs for their ability to support long-term data trending • Enable creation of consistent long-term cloud property datasets for EDR evaluation • To evaluate the climate quality of atmosphere EDRs, a consistent version of the calibration and science algorithms must be used for a long-term dataset (e.g., one month, one year, entire mission). • EDRs must be created rapidly in order for the Science Team to give timely feedback to the NPP project on algorithm performance. • NPP Science Team members do not have the individual resources to host large datasets, integrate operational NPP algorithms, and test improved/alternative algorithms. • The NPP cloud products must be put into context with historical and ongoing global cloud property datasets (e.g., PATMOS-X, UW-HIRS, MODIS Terra/Aqua) to create self-consistent climate data records (CDRs).

  21. APSPS Logical Design

  22. Processing System Trade Studies and Key Decisions • Examined NASA Ocean SDPS and MODIS Land Processing Systems. • Lessons learned: • Recipe-based approach to running science algorithms (system doesn’t care what the algorithm is, as long as it knows how to assemble the ingredients to make the recipe) • Cluster of compute resources (no need for a large shared memory computer) • Decouple the components of the processing system (store, compute, distribute) • Use commodity hardware/software (e.g., Rackmount Intel/AMD servers, Linux) • Key Design Decisions: • Create a system where individual components have loose dependencies on each other. • Leverage existing cluster processing hardware infrastructure and knowledge base. • Create a system which is scalable, efficient, and cost effective.
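A minimal sketch of the recipe idea in Python; the class and field names are illustrative assumptions, not the APSPS implementation. The point is that the system only matches ingredients (input product types) to recipes and runs them, without caring what the algorithm is:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Recipe:
    """A science algorithm described only by its ingredients and outputs;
    the system never inspects the algorithm itself."""
    name: str
    inputs: List[str]    # product types the recipe consumes
    outputs: List[str]   # product types it produces
    run: Callable[[Dict[str, str]], Dict[str, str]]  # file paths in -> out

def execute(recipe: Recipe, store: Dict[str, str]) -> Dict[str, str]:
    """Assemble the recipe's ingredients from the data store, run it,
    and register its products so later recipes can consume them."""
    missing = [p for p in recipe.inputs if p not in store]
    if missing:
        raise KeyError(f"{recipe.name}: missing ingredients {missing}")
    products = recipe.run({p: store[p] for p in recipe.inputs})
    store.update(products)
    return products
```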

  23. APSPS Components • DMS: Data Management System (stores data) • CRG: Computational Resource Grid (processes data) • ARM: Algorithm Rule Manager (applies product rules to data) • ING: Ingest System (brings data into the system)

  24. Science Processing System Preliminary Design. [Diagram: Ingest Data, Manage Data, Manage Processing, Process Data]

  25. Atmosphere PEATE: Prototype Hardware (July 2007). Based on commodity hardware: dual/quad-core servers, SATA RAID, Gigabit Ethernet. Prototype: 50 CPU cores, 40 TB disk. By NPP launch: 250 CPU cores, 215 TB disk.

  26. At-Launch PEATE Hardware • Compute Resources: • 250 CPU cores, AMD or Intel, 2 GB RAM per core. • Justification: 50 CPU cores yielded a ~50x EDR generation rate in prototyping studies. We will have 10 Atmosphere EDRs from NPP, and we desire a 100x EDR generation rate. • 50 CPUs = 4 EDRs at 50x • 100 CPUs = 4 EDRs at 100x • 250 CPUs = 10 EDRs at 100x • Storage: • 215 Terabytes (TB). • Justification: Aqua MODIS L0 volume is 18.25 TB/year compressed (ref: Ocean SDPS). We desire the complete Aqua MODIS L0 archive online (7.5 years), plus 3 years of space for NPP RDR (complete) and SDR+EDR (subset). Estimate NPP at 25 TB/year. • (18.25 TB/year x 7.5 years) + (25 TB/year x 3 years) ≈ 215 TB. • Networking: • Cisco Catalyst stackable Gigabit switch infrastructure (32 Gbps stack interconnect) with 4 x 48 ports.
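The sizing arithmetic behind these justifications, reproduced as a small Python check (the linear-scaling assumption for CPU cores is taken from the slide):

```python
# Storage: complete Aqua MODIS L0 archive plus three years of NPP data
modis_tb = 18.25 * 7.5          # 136.875 TB (Aqua MODIS L0, compressed)
npp_tb = 25 * 3                 # 75 TB (NPP RDR complete, SDR+EDR subset)
print(modis_tb + npp_tb)        # 211.875 TB, budgeted as 215 TB

# Compute: cores scale linearly with speed-up factor and EDR count
cores = 50 * (100 / 50) * (10 / 4)
print(cores)                    # 250 cores for 10 EDRs at 100x
```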

  27. Science Processing System: Lines of Code. Why so small? 1. Java libraries. 2. Open-source code contributions are not counted (e.g., AXIS Web Services, Hibernate, Tomcat, Log4J).

  28. SSEC Computing Security Plan • The SSEC network is segmented into security zones protected by a firewall. This is a first layer of defense and allows more critical infrastructure to be protected differently than, for example, guest computers. • Host-based firewalls are used where appropriate. • Routine antivirus scanning for Windows machines is required and maintained by TC. • Operating systems are patched routinely (TC monitors for remote root-level exploits). • Telnet, non-anonymous FTP, and other plaintext protocols are disabled or disallowed to avoid password exposure. • Systems are configured with the minimal network services required to fill their operational requirements. • Routine password audits are performed by TC. • All systems are backed up to aid in disaster recovery. Critical backups (source code repositories) are mirrored offsite.

  29. Algorithm Lifecycle. An algorithm can come from anywhere. Once qualified, it can be applied from ARM.

  30. Algorithm Ingest: 1. Algorithm entered into Subversion. 2. Product created in Bugzilla. 3. Algorithm is ported and wrapped. 4. Tests are created.

  31. Algorithm Qualification: 1. Write a script to execute the algorithm. 2. The script manages the execution environment. 3. The algorithm name, inputs, and outputs are entered into ARM.
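A hypothetical qualification wrapper might look like the following Python sketch; the argument conventions and environment handling are assumptions, since the actual wrapper scripts are algorithm-specific.

```python
import os
import subprocess
import tempfile

def run_algorithm(binary, args, env_extra):
    """Run a compiled EDR algorithm in a scratch directory with a
    controlled environment, and report the output files it produced."""
    workdir = tempfile.mkdtemp(prefix="edr_qual_")
    env = dict(os.environ, **env_extra)   # e.g., LUT paths, library dirs
    result = subprocess.run([binary] + list(args), cwd=workdir, env=env,
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"{binary} failed:\n{result.stderr}")
    # Output names/paths would be recorded in ARM along with the inputs
    return {name: os.path.join(workdir, name) for name in os.listdir(workdir)}
```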

  32. LEOCAT: Low Earth Orbit Cloud Algorithm Testbed. LEOCAT History • Developed by Mike Pavolonis at UW under VIIRS IGS funding to investigate differences between the operational VIIRS cloud algorithms and heritage algorithms in a manner that isolates algorithmic differences. • The best way to accomplish this is to apply each algorithm to the same Level 1B and ancillary data sets using the same radiative transfer model. • A secondary use of LEOCAT is to serve as an algorithm development tool and global EDR processing system. • The LEOCAT approach is also being used for GOES-R AWG work (GEOCAT). LEOCAT Features • Handles multiple imaging sensors (e.g., MODIS, VIIRS, AVHRR) • Executes multiple algorithms with the same and/or different parameters in one instance • Allows the addition of new algorithms with minimal programming effort (they can be added as shared libraries) • Produces HDF4 output • Can use CRTM or PLOD forward models • Can use GFS, NCEP, or ECMWF ancillary data • Optimized for efficiency • Processes a single granule or multiple granules in one instance
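LEOCAT itself loads algorithms as shared libraries; the sketch below captures the same dispatch idea in Python for illustration only (the registry, algorithm names, and placeholder outputs are all hypothetical):

```python
from typing import Callable, Dict

# Registry mapping algorithm names to callables; every algorithm sees the
# same Level 1B granule and ancillary data, isolating algorithmic differences.
ALGORITHMS: Dict[str, Callable] = {}

def register(name: str):
    def wrap(fn):
        ALGORITHMS[name] = fn
        return fn
    return wrap

@register("heritage_cloud_mask")
def heritage_cloud_mask(granule, ancillary):
    return {"cloud_fraction": None}   # placeholder for heritage spectral tests

@register("viirs_ops_cloud_mask")
def viirs_ops_cloud_mask(granule, ancillary):
    return {"cloud_fraction": None}   # placeholder for the VIIRS OPS logic

def run_all(granule, ancillary):
    """Apply every registered algorithm to the same inputs, so product
    differences reflect the algorithms rather than the data."""
    return {name: fn(granule, ancillary) for name, fn in ALGORITHMS.items()}
```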

  33. LEOCAT Architecture. [Diagram: LEOCAT Core with pluggable Science Algorithm components]

  34.-37. Atmosphere PEATE Development Schedule. [Schedule charts spanning four slides]

  38. EDR Evaluation Goals • The PEATE will be designed to identify algorithm/instrument issues from the physical sensitivity differences between the evaluation and VIIRS products • The VIIRS science team will be enlisted to establish protocols so that cloud product inter-comparisons are performed similarly for all algorithms • The goal is to automate the product inter-comparison process • The evaluation results will be compiled for each VIIRS processing run using established protocols • Graphics (figures) and comparison statistics will be automatically generated for review allowing for instantaneous feedback on changes to the VIIRS algorithms • When new evaluation measurements/retrievals become available they can be easily integrated into the evaluation system • Well-documented evaluation protocols for each VIIRS product will be created

  39. EDR Evaluation Measurement Plan • The NASA A-Train measurement platform using MODIS as a proxy for VIIRS will provide: • a platform to compare the VIIRS algorithms directly with MODIS, CALIPSO and CloudSat cloud retrievals (global) • a “baseline” for our global performance expectations for VIIRS • The assessment using ground measurements will provide well-calibrated point measurements that will be available at VIIRS launch • The combined ground/satellite evaluation using MODIS will provide a measure of how representative the ground evaluation will be in determining the global performance of the VIIRS retrievals at launch

  40. EDR Evaluation: Cloud Height Global Images (August 2006)

  41. EDR Evaluation Demonstration • Goal: To demonstrate the workflow necessary to evaluate a VIIRS atmosphere EDR for climate product quality. • Proxy Data: Aqua MODIS is the best available spectral simulation of VIIRS. • Products to be Compared: • VIIRS OPS Cloud Mask (versions 1.3 and 1.4) • MODIS operational cloud mask (collection 5) • MODIS operational cloud mask with VIIRS bands only • Components of Demonstration • Obtain Products (from archive, or generate from scratch) • Run Quality Control process on each product • Intercompare products (internally) • Validate products (using external data)

  42. Demonstration Work Plan. Process one global month of Aqua MODIS proxy data (day/night), starting with Level 1A data (RDR). Run MODISL1DB algorithms for geolocation and calibration. Run the DAAC operational algorithm for Level 1B destriping. Run the cloud mask algorithms. Examine the quality of each individual product (e.g., algorithm success/failure and retrieval yield statistics; processing summaries per granule and per day; granule-based images; clear radiance composites, daily, 8-day, and monthly). Intercompare the products (compute and map differences in clear-sky radiance statistics for the final retrieval and intermediate spectral tests). Validate each product by collocating with CALIPSO lidar and comparing to the CALIPSO cloud mask (an agreement-statistics sketch follows below). Each step in the process must be straightforward and well documented.
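For the validation step, the cloud-mask comparison reduces to 2x2 contingency statistics over the collocated VIIRS/CALIOP pairs. A minimal sketch, with illustrative metric names:

```python
import numpy as np

def agreement_stats(viirs_cloudy: np.ndarray, caliop_cloudy: np.ndarray) -> dict:
    """2x2 contingency statistics over collocated VIIRS/CALIOP pairs;
    both inputs are boolean arrays of equal length."""
    n = viirs_cloudy.size
    both_cloudy = np.sum(viirs_cloudy & caliop_cloudy)
    both_clear = np.sum(~viirs_cloudy & ~caliop_cloudy)
    return {
        "agreement": (both_cloudy + both_clear) / n,
        "false_cloud_rate": np.sum(viirs_cloudy & ~caliop_cloudy) / n,
        "missed_cloud_rate": np.sum(~viirs_cloudy & caliop_cloudy) / n,
        "n_collocations": int(n),
    }
```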

  43. MODIS Visible Image

  44. MODIS C5 Cloud Mask

  45. VIIRS OPS v1.4 Cloud Mask

  46. VIIRS OPS v1.4 Cloud Mask. [Two panels: preliminary/unverified ancillary & IP vs. verified ancillary & IP]

  47. EDR Evaluation: Satellite and Ground Measurements • Evaluate the effectiveness of proposed cloud algorithms and the resulting global cloud products generated from MODIS (proxy for VIIRS), AIRS (proxy for CrIS), CloudSat, and CALIPSO • A subsequent test of algorithm robustness will be to apply the cloud algorithms to METOP data (AVHRR, HIRS, and IASI) for a time period concurrent with the A-Train data analyses • Build the capability to assess instrument issues, such as out-of-band response, channels that perform out of spec, detector striping, etc. • When VIIRS is launched, it is unlikely that a space-based lidar/radar will be in operation, so there will not be continuous coincident lidar/radar measurements with VIIRS • A combined satellite and ground measurement plan provides a comprehensive evaluation capability to assess the VIIRS products

  48. Evaluation: Post-NPP-Launch Evaluation Flow Diagram. [Flow diagram]

  49. Atmosphere EDR Evaluation Summary At VIIRS launch, the Atmosphere PEATE EDR evaluation system will have the capability to: • Ingest and store global VIIRS RDRs, SDRs and atmosphere EDRs • Regenerate self-consistent SDR and EDR long-term datasets for evaluating climate quality of atmosphere EDRs • Ingest, process, and store the evaluation measurements (ground and satellite) • Collocate (space and time) the VIIRS SDRs and EDRs with the evaluation measurements (ground and satellite) • Produce quantitative comparisons between the VIIRS SDR/EDRs and the evaluation products (global, long-term) • Produce quick-look images of the VIIRS SDR/EDRs, evaluation, and collocated products • Distribute results to NPP Science Team

  50. Issues, Challenges, Concerns • EDR Algorithm Qualification/Verification • Concern: We have encountered problems obtaining meaningful test suites to verify that EDR algorithms operate in the Atmosphere PEATE environment as the algorithm designers intended. We must make a best guess as to which Gridded IPs and ancillary datasets should be used as proxies in pre-launch testing. • Mitigation: Use experienced algorithm developers within the Atmosphere, Land, and Ocean PEATEs to assess and come to consensus on the best pre-launch proxy datasets. • Hardware Constraints • Concern: Hardware resources allocated to the Atmosphere PEATE may not be sufficient to run meaningful pre-launch tests on VIIRS cloud algorithms if there are algorithm-related performance problems. • Mitigation: The Science Processing System is designed to transparently use any resources (CPU, storage) existing within SSEC. Non-PEATE CPUs can be added as needed to increase processing throughput.
