Presentation Transcript


  1. Application and Evaluation of CMAQ in the United States: Air Quality Forecasting and Retrospective Modeling. Shawn J. Roselle, NOAA Atmospheric Science Modeling Division, in partnership with the U.S. Environmental Protection Agency, Research Triangle Park, NC. ACCENT-CMAS Training Workshop on Air Quality Modeling, Sofia, Bulgaria, August 6, 2006

  2. Summary • Background • Operational evaluation with retrospective modeling • Operational evaluation in AQ Forecasting • Future directions

  3. CMAQ Evaluation Framework. Model components: model inputs (meteorology and emissions); chemical transformation (gas, aerosol, and aqueous phases); transport (advection and diffusion); removal (dry and wet deposition); CMAQ-predicted concentration and deposition. • Operational Evaluation: How do the model-predicted concentrations compare to observed concentration data? Are there large temporal or spatial prediction errors or biases? Are we getting the right answers? • Dynamic Evaluation: Can we capture observed air quality changes? Can the model capture changes related to meteorological events or variations? Can the model capture changes related to emission reductions? Are we getting the right answers for the right (or wrong) reasons? • Diagnostic Evaluation: Are model errors or biases caused by model inputs or by modeled processes? Can we identify the specific modeled process(es) responsible? • Probabilistic Evaluation: What is our confidence in the model-predicted values? How do observed concentrations compare within an uncertainty range of model predictions?

  4. Operational Evaluation: How well does CMAQ predict observed concentrations? (Example: Sulfate) CMAQ v4.5 (2005 release): Sulfate Predictions for Summer 2001 • Sulfate shows some of the best model performance • The majority of errors are unsystematic • Ensemble simulations (discussed later) can reduce unsystematic errors • Log-scale analyses are also being tested for aerosols
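A minimal sketch (not from the slides) of one common way to separate systematic from unsystematic error, a Willmott-style decomposition of the mean-square error based on a least-squares fit of the model values to the observations:

```python
# Sketch: decompose mean-square error into systematic and unsystematic parts.
# MSE = MSE_systematic + MSE_unsystematic (Willmott-style decomposition).
import numpy as np

def error_decomposition(obs, mod):
    """Return (mse, mse_systematic, mse_unsystematic) for paired 1-D arrays."""
    obs = np.asarray(obs, dtype=float)
    mod = np.asarray(mod, dtype=float)
    b, a = np.polyfit(obs, mod, 1)            # least-squares fit: mod ~ a + b*obs
    mod_hat = a + b * obs                      # regression-estimated model values
    mse   = np.mean((mod - obs) ** 2)          # total mean-square error
    mse_s = np.mean((mod_hat - obs) ** 2)      # systematic (linear-bias) component
    mse_u = np.mean((mod - mod_hat) ** 2)      # unsystematic (scatter) component
    return mse, mse_s, mse_u
```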

  5. Diagnostic Evaluation: Using molecular measurements for source apportionment and evaluation of primary carbonaceous PM2.5 emissions. CMAQ v4.3 with NEI99 v1+. Modeled/Observed Concentration

  6. Diagnostic Evaluation: Using molecular measurements for source apportionment and evaluation of primary carbonaceous PM2.5 emissions. CMAQ v4.4 with NEI99 v3. Modeled/Observed Concentration [Bhave, Pouliot, and Zheng, Diagnostic model evaluation for carbonaceous PM2.5 using molecular measurements in the southeastern U.S., in review.]

  7. Diagnostic Evaluation: Using molecular measurements for source apportionment and evaluation of primary carbonaceous PM2.5 emissions. Summary – Southeast U.S. Notes: • excluded Birmingham data • excluded Oak Grove data • used nonanal as the only tracer (cholesterol data not available) • after reducing the natural-gas emission factor by 95%

  8. Diagnostic Evaluation: Gas Ratio as a diagnostic indicator for CMAQ’s nitrate replacement with SO2 emission reductions. Gas Ratio (ratio of free ammonia to total nitrate); figure panels contrast the NH3-limited and HNO3-limited regimes for Pittsburgh observations and CMAQ predictions
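A minimal sketch of the gas ratio, assuming the common Ansari-and-Pandis-style definition (free ammonia over total nitrate, in molar units); the slide itself states only that it is the ratio of free ammonia to total nitrate:

```python
# Sketch (assumed definition): GR = ([NH3]+[NH4+]-2[SO4=]) / ([HNO3]+[NO3-]).
# GR characterizes whether nitrate formation is NH3-limited or HNO3-limited.
def gas_ratio(nh3, nh4, so4, hno3, no3):
    """All inputs are molar concentrations (e.g., micromoles per cubic meter)."""
    free_ammonia  = nh3 + nh4 - 2.0 * so4   # ammonia left after neutralizing sulfate
    total_nitrate = hno3 + no3              # gas-phase HNO3 plus aerosol nitrate
    return free_ammonia / total_nitrate
```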

  9. Diagnostic Evaluation: Improving Model Parameter Estimates: Wildfire Emission Specification in CMAQ • Spatial allocation of emissions based on forest surrogates leads to unrealistic spatial distributions • Reallocate National Emissions Inventory (NEI) prescribed-fire and wildfire emissions using MODIS Rapid Response fire pixel counts • MODIS (Moderate Resolution Imaging Spectroradiometer) is an instrument aboard the NASA Terra (EOS AM) and Aqua (EOS PM) satellites • Figure panels: Base, Reallocated, MODIS Fire Detect • Reallocation helps reduce bias and improves correlation in total carbon predictions (Courtesy: D. Roy)
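A minimal sketch (hypothetical arrays, not the actual NEI/MODIS processing code) of reallocating a fire emission total across grid cells in proportion to MODIS fire pixel counts rather than a forest-cover surrogate:

```python
# Sketch: redistribute a total fire emission over the model grid using
# MODIS Rapid Response fire pixel counts as spatial weights.
import numpy as np

def reallocate_fire_emissions(total_emission, fire_pixel_counts):
    """total_emission: domain total (e.g., tons/day); fire_pixel_counts: 2-D grid of detections."""
    counts = np.asarray(fire_pixel_counts, dtype=float)
    if counts.sum() == 0.0:
        return np.zeros_like(counts)          # no detections: nothing to allocate
    weights = counts / counts.sum()           # fraction of detections in each cell
    return total_emission * weights           # gridded emission field
```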

  10. Dynamic Evaluation: Evaluating CMAQ Responses to ΔEmissions. Mean Daily Maximum 8-hr Ozone from Summer 2002 to 2004 (NOx SIP Call) • Modeled changes are noticeably less than observed • Model results are strongly impacted by the use of the same “clean” boundary concentrations for both summer periods (currently being analyzed) • Actual emission reductions may have been greater than simulated due to mobile emission assumptions in these simulations (currently being analyzed)
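A minimal sketch (an illustrative simplification) of the quantity being evaluated: the daily maximum 8-hr running-mean ozone, averaged over a summer, and the change between the two summers:

```python
# Sketch: daily maximum 8-hr average ozone (MDA8) and its summer-mean change.
# Simplification: only the 17 windows fully contained within a calendar day.
import numpy as np

def mda8(hourly_o3):
    """hourly_o3: 24 hourly ozone values (ppb) for one day."""
    return max(np.mean(hourly_o3[h:h + 8]) for h in range(24 - 8 + 1))

def summer_mean_mda8(days_of_hourly_o3):
    """days_of_hourly_o3: iterable of 24-value arrays, one per summer day."""
    return float(np.mean([mda8(day) for day in days_of_hourly_o3]))

# change = summer_mean_mda8(o3_2004_days) - summer_mean_mda8(o3_2002_days)
```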

  11. Probabilistic Evaluation: Considering Model Uncertainty and Sensitivity to Inputs and Model Options • Series of 6 meteorological simulations • Two chemical mechanisms • 12 CMAQ simulations for an ensemble case study (six runs currently available) • Ensemble case study will be used to test analysis approaches and scope out requirements • Additional factors (e.g., emissions, boundary conditions) considered for additional members of the ensemble case study

  12. Probabilistic Model Evaluation Application: Improving Model Performance. CMAQ Ensembles and Measurement Data (PAQS = Pittsburgh Air Quality Supersite), January 2002. Figure legend: PAQS Data, Ensemble Average, Ensembles • Ensemble average (red) has greater correlation with observations (grey) than any single member of the ensemble (blue) • Ensemble variance can capture the model uncertainty related to the different modeling options used
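A minimal sketch (illustrative) of the comparison behind the first bullet: correlate each ensemble member, and the ensemble average, with the observed time series:

```python
# Sketch: correlation of individual members and of the ensemble mean with observations.
import numpy as np

def member_and_mean_correlations(obs, members):
    """obs: 1-D array of observations; members: 2-D array (n_members, n_times)."""
    obs = np.asarray(obs, dtype=float)
    members = np.asarray(members, dtype=float)
    member_r = [np.corrcoef(obs, m)[0, 1] for m in members]   # one r per member
    mean_r = np.corrcoef(obs, members.mean(axis=0))[0, 1]     # r for the ensemble average
    return member_r, mean_r
```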

  13. Probabilistic Model Evaluation Application: Probability of Occurrence. Example threshold (demonstration only): probability of [NO3-] > 5 μg m-3, January 2002 • Design ensembles to span the range of scientifically reasonable model estimates • Estimate the probability of exceeding a concentration threshold based on the distribution of results • This approach can be used to consider the likelihood of success of a regulatory action
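A minimal sketch (illustrative) of the exceedance probability: the fraction of ensemble members above the threshold at each grid cell or time:

```python
# Sketch: probability of exceeding a threshold, estimated across the ensemble dimension.
import numpy as np

def exceedance_probability(members, threshold=5.0):
    """members: array with the ensemble dimension first (e.g., members x cells)."""
    members = np.asarray(members, dtype=float)
    return (members > threshold).mean(axis=0)   # fraction of members exceeding the threshold
```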

  14. CMAQ Retrospective Modeling: Operational Evaluation for 2001

  15. PM Monitoring Networks • CASTNet: Clean Air Status and Trends Network • IMPROVE: Interagency Monitoring of Protected Visual Environments • SEARCH: Southeastern Aerosol Research and Characterization study • STN: Speciation Trends Network

  16. Air Quality System (AQS)

  17. Annual 2001 Simulation: Modeling Platform • MM5-CMAQ v4.5 system • CB4 chemical mechanism; AERO4 modal aerosol module • 36-km × 36-km Continental U.S. grid with a nested 12-km × 12-km Eastern U.S. grid • 14 vertical layers to 100 mb • USEPA National Emission Inventory (NEI) 2001 • BEIS3.13; Mobile6; NH3 monthly emission factors • Boundary conditions adapted from GEOS-Chem (Harvard Univ. global CTM; Bey et al., 2001)

  18. Maximum 8-hour Average Ozone for Summer (12-km domain)

  19. Sulfate Aerosols

  20. Seasonal sulfate performance, CMAQ v4.5 (12 km): under-predictions in winter and spring; unbiased in summer; over-predictions in the fall. Figure panels: Winter, Spring, Summer, and Fall SO4, CMAQ v4.5 (12 km)

  21. Carbonaceous Aerosols

  22. Total Carbon. Figure panels: Winter, Spring, Summer, and Fall TC, CMAQ v4.5 (12 km)

  23. Median Differential Bias*: TC, EC, and OC. *median(Obs-Mod) over all sites
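A minimal sketch of the footnote's metric, the median of the observed-minus-modeled differences pooled over all sites:

```python
# Sketch: median differential bias = median(Obs - Mod) over all paired samples.
import numpy as np

def median_differential_bias(obs, mod):
    """obs, mod: paired arrays of observed and modeled concentrations (ug/m3)."""
    return float(np.median(np.asarray(obs, dtype=float) - np.asarray(mod, dtype=float)))
```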

  24. Total Nitrate (gas-phase HNO3 plus aerosol NO3)

  25. CASTNet comparisons: NO3, TNO3, and HNO3. Is there a problem with the N2O5 heterogeneous pathway for HNO3?

  26. Total Nitrate. Figure panels: Winter, Spring, Summer, and Fall

  27. CMAQ v4.5 Summary Plots

  28. Soccer goal plot – Winter (v4.5)

  29. Soccer goal plot – Summer (v4.5)
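The slides do not define the soccer goal plot; it commonly displays mean fractional bias (MFB) against mean fractional error (MFE) with nested performance "goal" boxes. A minimal sketch of those two assumed metrics:

```python
# Sketch (assumed metrics): mean fractional bias and mean fractional error,
# the quantities a soccer goal plot typically shows on its two axes.
import numpy as np

def mfb_mfe(obs, mod):
    """obs, mod: paired arrays of observed and modeled concentrations."""
    obs = np.asarray(obs, dtype=float)
    mod = np.asarray(mod, dtype=float)
    frac = 2.0 * (mod - obs) / (mod + obs)   # fractional difference for each pair
    mfb = 100.0 * np.mean(frac)              # mean fractional bias (%)
    mfe = 100.0 * np.mean(np.abs(frac))      # mean fractional error (%)
    return mfb, mfe
```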

  30. Summary – 2001 PM2.5 Retrospective • O3: underpredicts high values; overpredicts low values • SO4: good agreement in summer; underpredicted in winter and spring; overpredicted in fall • NO3: some overprediction in cooler months; underpredicted in summer • TNO3: overpredicted in summer and fall months

  31. Summary – 2001 PM2.5 Retrospective (cont.) • Carbonaceous aerosols: TC (both EC and OC) underpredicted in spring and summer; TC overpredicted in urban areas (STN) in winter; good agreement in the fall, and in rural areas in winter • PM2.5 (STN, IMPROVE): overpredicted in fall and winter; small overprediction in spring; good agreement in summer (driven by sulfate)

  32. Air Quality Forecasting: Application and Operational Evaluation

  33. NOAA-EPA AQ Forecasting • The NOAA/EPA partnership has produced an operational daily AQ forecasting capability • Eta-CMAQ model system • Evolved to a WRF-CMAQ system in June 2006 • Twice-daily O3 forecast guidance over the Eastern U.S. at 12-km grid resolution • 2006: experimental testing of a Continental U.S. domain for O3 forecast guidance • 2005-2006: developmental testing of the Eastern U.S. domain for PM2.5 forecast guidance

  34. Eta-CMAQ AQF System components: • Meteorology model: Eta-12 • Eta Post: vertical interpolation from eta to sigma coordinates • PRDGEN: horizontal interpolation to the Lambert grid • PREMAQ: CMAQ-ready meteorology and emissions • Chemistry model: CMAQ • AQF Post: gridded ozone files for users • Verification tools: performance feedback for users and developers

  35. Forecast model domains (figure showing the grid dimensions, in numbers of grid cells, of each domain)

  36. Example CMAQ Ozone Forecast

  37. Daily Mean PM2.5 (2005): model versus observed concentrations (μg/m3) by quarter (JFM, AMJ, JAS, OND)

  38. Domain Mean Concentration by quarter: JFM-2005, AMJ-2005, JAS-2005, OND-2005

  39. PM2.5 Spatial Bias Plots, 2005. Figure panels: JFM, AMJ, JAS, OND

  40. Summary Statistics, 2005: Daily Mean PM2.5 (μg/m3)

  41. February 2005 Winter PM Episode: model and observed daily average surface PM2.5 (μg/m3), 2/1/05 through 2/6/05. Captures hot-spots; tendency to over-predict

  42. CMAQ-AQF Model for Sulfate, Nitrate, and Ammonium: January-March 2005

  43. CMAQ-AQF Model for Organic Carbon, Elemental Carbon, Total Carbon, and Total PM2.5: January-March 2005

  44. Summary – Air Quality PM2.5 Forecasts • Real-time daily PM2.5, 2005 • CMAQ overpredicts in fall/winter • CMAQ underpredicts in spring/summer • Model temporal trends generally track observations • Significant variability

  45. Summary – Air Quality PM2.5 Forecasts (cont.) • Speciated PM2.5, winter 2005 • SO4: smallest biases, significant variability • NO3 and NH4: slight overprediction at CASTNet and STN; significant overprediction at SEARCH • OC and EC: underpredicted • PM2.5: overprediction at STN and SEARCH

  46. Concluding Remarks

  47. Evaluation Activities Summary • Operational evaluation approaches provide reference for CMAQ performance and identify issues for further diagnostic testing • Current diagnostic evaluation studies target inorganic and organic aerosol predictions • Inverse modeling and receptor modeling included to consider sensitivity of PM predictions to emission inputs • Dynamic: Evaluating the model’s ability to capture concentration changes in response to changes or variations in emissions or meteorology • Probabilistic: Develop simulations and evaluation results that provide a range of results for different specifications/scenarios (e.g., ensembles) • These evaluation approaches can apply both to retrospective and forecasting applications

  48. Future Directions • Clarify data needs • Chemical and aerosol profiles in the atmosphere • For evaluation and data assimilation • Global Earth Observation System of Systems (GEOSS); international coordination • Continuous PM size and composition data • Emissions data • Real-time estimates of fire and dust emissions; power plant emissions from variable loading and fuel switching • Improved primary PM inventory (size and chemical distribution) • Significant gaseous precursors (especially NH3 and organics)

  49. Future Directions (cont.) • Boundary concentrations • Assumed climatological profiles • Adaptation of satellite data for chemical profiles • Use of global chemical models • Ensemble model simulations • Better guidance from multiple realizations? • Evaluation techniques for ensembles • Probabilistic framework

  50. Future Directions (cont.) • Integrated meteorology-chemistry modeling • Two-way communication • Improvements to both AQ and meteorology forecasts? • Archive of forecast results • Continuous record for science and policy assessments
