
Presentation Transcript


  1. Improving an Air Quality Decision Support System through the Integration of Satellite Data with Ground-Based, Modeled, and Emissions Data: DSS Performance Metrics
     Sarav Arunachalam, Institute for the Environment, University of North Carolina at Chapel Hill
     Presented at the NASA ROSES DSS Design Workshop, Cooperative Institute for Research in the Atmosphere, Ft. Collins, CO, May 12-13, 2008

  2. Background – NASA Reporting Requirements (1 of 2)
     • At least 2 Demonstrations and Prototypes
       • One demonstration should show the configuration and performance of the prototype
       • One demonstration should show the use and performance of the enhanced DSS in the end users' operational environment
     • Quarterly Reports: one-page chart with four sections
       • Major accomplishments and performance measures
       • Major milestones
       • Risks, risk mitigations, and project metrics
       • Issues and action items to address

  3. Background – NASA Reporting Requirements (2 of 2)
     • Final Report
       • System configuration diagram
       • Results of feasibility evaluation(s), issues encountered and resolved
       • Integration issues and interoperability issues encountered and resolved
       • Robust documentation of procedures and guidelines describing the steps to access, integrate, and utilize the Earth science research results
       • Lessons learned
       • Resource estimate for the user organizations' adoption and sustained use of the Earth science products
       • Recommendations and remaining issues facing the sustained use of the Earth science data in the enhanced DSS by the partner agency and end users
       • Analysis of DSS performance with the Earth science results compared to the baseline
         • Quantitative and qualitative enhancements to the DSS and related decision-making
         • Quantitative and qualitative socio-economic benefits (actual or estimated) from the improved decision-making enabled by the project and enhanced DSS

  4. Develop Performance Measures: Examples
     Focus on 2 broad types:
     • Effectiveness
     • Achievement of end-user agency mission

  5. Develop Performance Measures: Effectiveness
     • Measures based upon Information Content
       • Content
       • Accuracy
       • Extensiveness
       • Usefulness
       • Search capability
       • Use policy?
     • Measures based upon Ease of Use
       • Overall design
       • Ease of navigation
         • Can the end-user find things needed in a timely manner?
       • Mechanism for end-user feedback
       • Accessibility
       • Speed
       • Ease in performing analyses
         • Statistical, graphical, tabular reports
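One way to quantify measures like these is to have end users rate each item on a 1-5 Likert scale in a feedback survey and aggregate the ratings by measure group. A minimal sketch follows; the question names, category mapping, and ratings are hypothetical illustrations, not taken from the presentation.

    # Minimal sketch: aggregate hypothetical 1-5 Likert ratings into a score
    # for each broad measure group named on this slide. All question names
    # and values are illustrative assumptions.
    from statistics import mean

    # Hypothetical survey responses: question -> list of 1-5 ratings.
    responses = {
        "content_accuracy":      [4, 5, 3, 4],
        "content_extensiveness": [3, 4, 4, 3],
        "ease_of_navigation":    [2, 3, 3, 4],
        "report_generation":     [4, 4, 5, 3],
    }

    # Map each question to one of the two broad measure groups.
    categories = {
        "Information Content": ["content_accuracy", "content_extensiveness"],
        "Ease of Use":         ["ease_of_navigation", "report_generation"],
    }

    for category, questions in categories.items():
        score = mean(mean(responses[q]) for q in questions)
        print(f"{category}: {score:.2f} / 5")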

  6. Develop Performance Measures: Achievement of Agency Mission
     • Time taken to support decision-making
       • Performing analyses
       • Data downloads
       • Others?
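The "time taken" measure lends itself to direct instrumentation. Below is a minimal sketch that assumes nothing about the actual DSS internals; fetch_observations and the example URL are hypothetical placeholders.

    # Minimal sketch: wrap an analysis or data-download step in a wall-clock
    # timer so "time taken to support decision-making" can be logged and
    # compared between the baseline and enhanced DSS.
    import time
    import urllib.request

    def timed(label, func, *args):
        """Run func(*args) and print elapsed wall-clock time under label."""
        start = time.perf_counter()
        result = func(*args)
        elapsed = time.perf_counter() - start
        print(f"{label}: {elapsed:.2f} s")
        return result

    def fetch_observations(url):
        # Hypothetical download step; a real DSS would pull monitor,
        # satellite, emissions, or model data here.
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    # Example usage (hypothetical URL):
    # data = timed("data download", fetch_observations, "https://example.gov/obs.csv")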

  7. Develop Performance Measures: How to?
     • Generate a questionnaire to survey AQ planners on defining baseline and probable performance measures
       • WRAP and VISTAS RPO mailing lists
       • EPA offices
       • Others?
     • Think of emissions, monitoring, and model data
     • Target all applications of air quality management
       • Developing SIPs and FIPs
       • Real-time forecasting
       • Regulatory Impact Analyses
       • Others?
     • How to quantify the expertise level of each user
     • Distinguish between baseline and enhanced phases of the DSS
     • Others?

  8. Develop Performance Measures: How often?
     • How to define benchmarks of progress
       • Repeat the survey and see how things have improved or changed for end users
       • Target other planned conferences/meetings and try to document progress
         • EPA-organized State/Local Modelers Meeting?
         • RPO modelers meetings
       • Solicit feedback from end users at these conferences
     • Plan for a Year 2 meeting of end users
       • Funding constraints may mean a teleconference instead of a face-to-face meeting
     • Others?
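The repeat-survey comparison this slide proposes could be documented with a simple per-measure delta report. A minimal sketch with hypothetical scores (none of these numbers come from the presentation):

    # Minimal sketch: compare hypothetical per-measure survey scores from the
    # baseline phase against a later (enhanced-DSS) survey and report the
    # change, so progress can be documented between benchmarks.
    baseline = {"content": 3.2, "ease_of_use": 2.8, "analysis_time": 3.0}
    year2    = {"content": 3.9, "ease_of_use": 3.4, "analysis_time": 3.6}

    for measure in baseline:
        delta = year2[measure] - baseline[measure]
        print(f"{measure}: {baseline[measure]:.1f} -> {year2[measure]:.1f} ({delta:+.1f})")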

  9. Questions / Discussion
