
Evaluation of the VISTAS 2002 CMAQ/CAMx Annual Simulations


Presentation Transcript


  1. Evaluation of the VISTAS 2002 CMAQ/CAMx Annual Simulations. T. W. Tesche & Dennis McNally -- Alpine Geophysics, LLC; Ralph Morris -- ENVIRON; Gail Tonnesen -- UC Riverside; Patricia Brewer -- VISTAS Technical Coordinator; James Boylan -- Georgia Dept. of Natural Resources. Models-3 CMAS Conference, 18-20 October 2004, Chapel Hill, NC

  2. Outline • VISTAS objectives • Model set-up for initial Phase II runs • Highlights of CMAQ/CAMx evaluations • Operational, Comparative, Diagnostic, Mechanistic • Some findings from diagnostic studies • Suggestions

  3. VISTAS AQ Modeling Objectives • Phase I: • Evaluate suite of models for episodic and annual simulation of Regional Haze & PM2.5 on 36/12 km US grid • Phase II: • Select and evaluate preferred model(s) for 2002 annual period via detailed model performance and sensitivity evaluations • Evaluate emission control strategies for regional haze, particularly for the VISTAS region • Support VISTAS states responsible for upcoming PM2.5 attainment demonstrations

  4. Model Set-up for Initial 2002 Annual Run • 36/12 km grid, 19 layers • CMAQ v4.3 and CAMx v4.0 • MM5 (Pleim-Xiu_ACM8, 36/12 km) • 2002 Emissions for VISTAS states (WRAP and CENRAP updates; NEI 1999 V2 for rest of U.S.) • CMAQ (CB4, SORGAM); CAMx (CB4, SOAP) • BCs from 2001 Seasonal GEOS-CHEM • Models run in 4 quarters with 15-day spin-up • VISTAS Phase II Modeling Protocol followed • For reports, results, presentations: http://pah.cert.ucr.edu/vistas/vistas2/reports
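
The quarterly segmentation noted above (four quarterly runs, each begun 15 days early for spin-up) can be sketched as follows; the dates and variable names are illustrative assumptions, not the actual VISTAS run scripts.

    # Illustrative sketch of the quarterly run segmentation: each quarter of 2002
    # is simulated separately, starting 15 days early so the model can spin up
    # before the period used in the evaluation. Dates/names are assumptions for
    # illustration, not taken from the VISTAS scripts.
    from datetime import date, timedelta

    SPINUP_DAYS = 15
    quarters = [
        (date(2002, 1, 1), date(2002, 3, 31)),    # Q1
        (date(2002, 4, 1), date(2002, 6, 30)),    # Q2
        (date(2002, 7, 1), date(2002, 9, 30)),    # Q3
        (date(2002, 10, 1), date(2002, 12, 31)),  # Q4
    ]

    for i, (q_start, q_end) in enumerate(quarters, start=1):
        run_start = q_start - timedelta(days=SPINUP_DAYS)  # spin-up begins here
        print(f"Q{i}: simulate {run_start} to {q_end}; evaluate {q_start} to {q_end}")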

  5. Operational Evaluation • Focus on • Visibility-related PM species • Identify needed improvements before final 2002 basecase simulations begin (next week…!) • Use suite of 15 metrics and graphical tools • Evaluate by month and monitoring network • Multiple evaluation teams • ENVIRON, UCR, Alpine, VISTAS-TAWG, GA-DNR
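
For reference, two of the core statistics in the metric suite, fractional bias and fractional error, can be computed as in the minimal sketch below; the function and variable names are ours, not the VISTAS evaluation code. Fractional bias is bounded by +/-200% and fractional error by 0-200%, which is why the plots that follow note their axis scales.

    # Minimal sketch of fractional bias (FB) and fractional error (FE), two of the
    # statistics used in the operational evaluation. Names are illustrative only.
    def fractional_bias(model, obs):
        # FB = (2/N) * sum((M - O) / (M + O)) * 100%
        pairs = [(m, o) for m, o in zip(model, obs) if (m + o) > 0.0]
        return 200.0 * sum((m - o) / (m + o) for m, o in pairs) / len(pairs)

    def fractional_error(model, obs):
        # FE = (2/N) * sum(|M - O| / (M + O)) * 100%
        pairs = [(m, o) for m, o in zip(model, obs) if (m + o) > 0.0]
        return 200.0 * sum(abs(m - o) / (m + o) for m, o in pairs) / len(pairs)

    # Example: paired 24-hr sulfate concentrations (ug/m3), model vs. monitor
    model_so4 = [2.1, 3.4, 1.8, 5.0]
    obs_so4 = [2.0, 3.0, 2.5, 4.2]
    print(fractional_bias(model_so4, obs_so4), fractional_error(model_so4, obs_so4))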

  6. Monitors in VISTAS 12 km MPE Domain (Yorkville, GA highlighted)

  7. Sulfate Fractional Bias and Error: CMAQ (note scale: 0-100%); IMPROVE Data for VISTAS States, 12 km grid

  8. Nitrate Fractional Bias and Error: CMAQ (note scale: 0-200%); IMPROVE Data for VISTAS States, 12 km grid

  9. Bias as a Function of Concentration: CMAQ; Data for VISTAS States, 12 km grid

  10. Operational Evaluation Summary for CMAQ & CAMx • Good: SO4 and EC • Good-Fair: PM2.5 and PM10 • Fair: NH4 • Fair-Poor: OC and CM • Poor: NO3 and Soils

  11. Comparative Evaluation • Inter-compare CMAQ v4.3 and CAMx v4.0 • Use identical SMOKE/MM5 inputs & VISTAS evaluation protocol • Examine reasons for similar and divergent behavior • Gas-phase and aerosol species • Wet and dry deposition patterns • Conduct sensitivity experiments to elucidate similar and divergent behavior in CMAQ and CAMx

  12. CMAQ/CAMx Fractional Error: 12 km

  13. CMAQ/CAMx Fractional Bias: 12 km (note the EC/CM “flip-flop”)

  14. Comparative Evaluation Summary • In general: CMAQ and CAMx respond consistently for most gas-phase and PM species • Winter: Large over-predictions of NO3 and CM • Summer: Large under-predictions of NO3 (but concentrations are quite small) • All seasons: Soils over-predicted; OC under-predicted (understated primary OC emissions?)

  15. Diagnostic Evaluation • Examine PM and gas-phase species by network • Evaluate effects of grid resolution, model response by sub-region, and range of time scales • Examine differences in CMAQ/CAMx response • Synthesize CMAQ/CAMx model evaluation results to elucidate possible sources of model bias and error (e.g. formulation, inputs, …)

  16. CMAQ NO3 Fractional Bias: 12 km

  17. Seasonal & Annual Average Aerosol Bias and Error: CMAQ; IMPROVE Data for VISTAS States, 12 km grid

  18. CMAQ Spatial Mean Nitrate: VISTAS vs. MANE-VU (panels: VISTAS and MANE-VU, Jan ‘02 and May ‘02)

  19. CMAQ Spatial Mean Sulfate: VISTAS vs. MRPO (panels: VISTAS and MRPO, Jan ‘02 and May ‘02)

  20. Spatial Mean EC Dry Deposition (panels: CMAQ and CAMx, Jan ’02 and Jul ’02); CMAQ deposition > CAMx deposition for EC

  21. Spatial Mean CM Dry Deposition (panels: CMAQ and CAMx, Jan ’02 and Jul ’02); CMAQ deposition << CAMx deposition for CM

  22. SEARCH Hourly Sulfate at Yorkville, GA: Jan ‘02

  23. SEARCH Hourly Nitrate at Yorkville, GA: Jan ‘02

  24. Yorkville NO3, Temperature & Mixing Ratio Time Series (Jan ’02)

  25. SEARCH Hourly Nitrogen Species at Yorkville, GA: Jan ‘02 (NO, NO2, NOy, HNO3)

  26. Bias in Hourly VISTAS Domain-Wide MM5 Fields: Jan ‘02

  27. Diagnostic Evaluation Summary • CMAQ and CAMx consistent for most species across all domains and time scales • EC/CM bias ‘flip-flop’ due to different dry deposition algorithms in CMAQ/CAMx • OC bias differences in CMAQ/CAMx attributed, in part, to • Different SOA chemistry formulations • Different environmental chamber data sets and parameterizations

  28. Mechanistic Evaluation: CB4 vs SAPRC99 for Jan ’02 & Jul ’01 Episodes • Very Similar Base Case Performance for SO4, NO3 and OC: • Differences between 36 and 12 km grid larger than differences between CB4 and SAPRC • SAPRC exhibits slightly improved performance for ozone compared to CB4 • Generally Similar Response to 30% Controls, except: • SO4 sensitivity to NOx controls • SAPRC approximately twice as sensitive • Tied to H2O2 and O3 sensitivity to NOx controls • O3 sensitivity to VOC • SAPRC more sensitive than CB4
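
The “approximately twice as sensitive” comparison amounts to a ratio of relative responses to the same 30% emissions reduction; a minimal sketch with placeholder numbers (not actual VISTAS results) is shown below.

    # Minimal sketch of the emissions-sensitivity comparison: the relative response
    # of a species concentration to a 30% NOx emissions cut, computed for each
    # chemical mechanism and compared as a ratio. Concentrations below are
    # placeholders chosen to show the arithmetic, not actual VISTAS/CMAQ output.
    def relative_response(base_conc, control_conc):
        # Fractional change in concentration caused by the emissions control
        return (base_conc - control_conc) / base_conc

    # Hypothetical 24-hr SO4 (ug/m3): base case vs. 30% NOx reduction
    cb4_response = relative_response(base_conc=4.00, control_conc=3.92)    # ~2% drop
    saprc_response = relative_response(base_conc=4.00, control_conc=3.84)  # ~4% drop

    print(f"CB4 SO4 response:   {cb4_response:.3f}")
    print(f"SAPRC SO4 response: {saprc_response:.3f}")
    print(f"SAPRC/CB4 ratio:    {saprc_response / cb4_response:.1f}")  # ~2x more sensitive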

  29. Three Suggestions • Devote greater emphasis to the diagnostic component of MPE (consider range of time and space scales, super-site data sets) • Utilize the extensive 2002 aircraft database for aloft model evaluation (probe ‘regional transport’ issue) • Employ corroborative models to explore key uncertainties in • Input database development • Base case model performance • Reliability of model response to emission controls
