AMS/EPA Workshop on the Evaluation of Regional-Scale Air Quality Modeling Systems: Overview & Next Steps

Tyler Fox, USEPA

6th Annual CMAS Conference

October 1, 2007

Evolving US Air Quality Management System

Source: John Bachmann, EM Magazine, June 2007

Steering Committee Members
  • S.T. Rao
  • Alice Gilliland
  • Kenneth Schere
  • Robin Dennis
  • Dr. Steven Hanna
  • John S. Irwin
  • Christian Hogrefe
  • Prof. Douw Steyn
  • Prof. Montserrat Fuentes
  • Prof. Akula Venkatram
  • Christian Seigneur
  • Rich Scheffe
  • Tyler Fox
Workshop Objectives
  • Discuss approaches to advance process-level evaluations of meteorology, emissions, chemistry, and chemical-transport modeling.
  • Discuss and develop approaches to advance air quality model evaluation methods and procedures.
  • Develop specific recommendations for model evaluations from air quality management and forecasting perspectives.
Keynote Topics
  • Evaluating performance of
    • meteorological processes within air quality modeling systems
    • source and sink processes within air quality modeling systems
    • chemistry and aerosol processes within air quality modeling systems
  • Methods and processes for evaluating the performance of air quality modeling system components
Some discussion items
  • Some of the most important MET variables for AQ modeling are not routinely and reliably evaluated (e.g., PBL depth and cloud properties)
  • Discussed a number of model probing tools:
    • Source apportionment & receptor modeling
    • Sensitivity analysis & process analysis
  • Measurements are critical, so it is imperative that model developers/users be involved in the design of monitoring networks and special field studies.
  • Model evaluation methods should be specific to the context of the application (i.e., fit for purpose).
  • Purpose of evaluation?
    • Acceptance for an application
    • Guide and influence further modeling system improvements



[Figure: Processes Affecting Modeled PM2.5 (source: Prakash Bhave) — conceptual chart plotting processes such as PBL height and other MET processes, wet removal, SO4 aqueous chemistry, OC aging, gas-phase mechanisms, and temperature/RH by their potential to improve model performance versus the availability of lab or field measurements.]



Operational Evaluation
  • Are we getting the right answers?
  • How do the model-predicted concentrations compare to observed concentration data?
  • What are the overall temporal or spatial prediction errors or biases?

Modeled processes under evaluation (diagram): model inputs (meteorology and emissions); chemical transformation (gas, aerosol, and aqueous phases); transport (advection and diffusion); removal (dry and wet deposition); outputs (concentration and deposition).

Dynamic Evaluation
  • Can we capture observed changes in air quality?
  • Can the model capture changes related to meteorological events or variations?
  • Can the model capture changes related to emission reductions?

Diagnostic Evaluation
  • Are we getting the right answers for the right (or wrong) reasons?
  • Are model errors or biases caused by model inputs or by modeled processes?
  • Can we identify the specific modeled process(es) responsible?
  • Can we identify needed improvements for modeled processes or inputs?

Probabilistic Evaluation
  • What is our confidence in the model-predicted values?
  • How do observed concentrations compare within an uncertainty range of model predictions?
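The operational-evaluation questions above ask how model predictions compare to paired observations. A minimal sketch of two common summary statistics, mean bias and RMSE, using made-up illustrative values (not data from the workshop):

```python
import math

def mean_bias(pred, obs):
    """Average of (model - observed) over all paired values; sign shows over/underprediction."""
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)

def rmse(pred, obs):
    """Root-mean-square error of paired model/observed values."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

# Hypothetical paired PM2.5 concentrations (ug/m3) at four monitor-hours
predicted = [12.0, 15.5, 9.8, 11.2]
observed = [10.0, 14.0, 11.0, 12.0]

print(mean_bias(predicted, observed))
print(rmse(predicted, observed))
```

A positive mean bias indicates net overprediction; RMSE captures overall error magnitude regardless of sign.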

EPA Modeling Guidance for SIP Demonstrations
  • In April 2007, EPA released the final version of Guidance on the Use of Models and Other Analyses for Demonstrating Attainment of Air Quality Goals for Ozone, PM2.5, and Regional Haze
    • Chapter 18 “What are the procedures for evaluating model performance and what is the role of diagnostic analysis?”
    • Appendix B “Summary of recent model performance evaluations”
  • Available at:


Morris, R., et al., “Model and Chemistry Inter-comparison: CMAQ with CB4, CB4-2002, SAPRC99”, National RPO Modeling Meeting, Denver, CO, 2005b.

  • Based on US (36-km) and VISTAS (12-km) January 2002 modeling, conducted chemistry mechanism inter-comparisons for CMAQ with CB4, CB4-2002, and SAPRC99.
  • The performance of CB4 and CB4-2002 was similar for PM, and superior to SAPRC99 overall (for the Jan02 case).
  • Model performance for CMAQ/CB4 on the US 36-km domain was in the range of:
    • Sulfate: MFE = 42% ~ 73%, MFB = -21% ~ +14%
    • Nitrate: MFE = 62% ~ 105%, MFB = -21% ~ +46%
    • Organic: MFE = 50% ~ 77%, MFB = +3% ~ +59%
    • EC: MFE = 59% ~ 88%, MFB = +2% ~ +70%
    • Soil: MFE = 165% ~ 180%, MFB = +164% ~ +180%
    • PM2.5: MFE = 48% ~ 88%, MFB = +25% ~ +81%
  • Given that the computational cost of SAPRC99 is twice that of CB4, it was suggested to use 36- and 12-km grids with CB4 chemistry for PM modeling for the time being.
  • Noted that both CB4 and SAPRC99 significantly underpredicted winter O3.
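The MFE and MFB statistics quoted above are the standard fractional error/bias metrics used in PM model evaluation (bounded at ±200%, which avoids division-by-near-zero problems of normalized bias). A short sketch with made-up values, not the Morris et al. data:

```python
def mfb(model, obs):
    """Mean Fractional Bias (%): (2/N) * sum((M - O) / (M + O)) * 100."""
    n = len(obs)
    return 200.0 / n * sum((m - o) / (m + o) for m, o in zip(model, obs))

def mfe(model, obs):
    """Mean Fractional Error (%): (2/N) * sum(|M - O| / (M + O)) * 100."""
    n = len(obs)
    return 200.0 / n * sum(abs(m - o) / (m + o) for m, o in zip(model, obs))

# Hypothetical sulfate concentrations (ug/m3) at three paired monitor-days
model = [8.0, 12.0, 6.0]
obs = [10.0, 10.0, 5.0]

print(round(mfb(model, obs), 1))
print(round(mfe(model, obs), 1))
```

Because each pair is normalized by the average of model and observation, MFB can stay small while MFE is large when over- and underpredictions cancel, which is why both numbers are reported together in the ranges above.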

Workshop Next Steps
  • The workshop Steering Committee is preparing a manuscript summarizing the workshop participants’ recommendations for publication in the Bulletin of the American Meteorological Society.
  • Conduct follow-on workshop(s) in 2008 to discuss the results of the applications of the recommended methods and lessons learned.
Expected Outcomes
  • Promote dialogue across the community to gain a better understanding and, ultimately, agreement on “accepted” evaluation methods and techniques
  • Build confidence in the use of regional-scale air quality model outputs for air quality management and air quality forecasting purposes
Evolving US Air Quality Management System

Source: John Bachmann, EM Magazine, June 2007

challenges ahead
Challenges Ahead
  • SIP Modeling for Attainment Demos
    • Dynamic evaluations of model responsiveness
  • Public Health and Exposure Assessments
    • Improve air quality characterization for health studies at local and neighborhood scales
  • Integrated, Multi-Pollutant AQM Planning
    • “One-atmosphere” modeling to better inform control strategy development & more comprehensive planning
  • Climate-Air Quality Linkages
    • Link climate and regional modeling systems to address feedbacks on emissions, meteorology, and chemistry.