Evaluation of the VISTAS 2002 CMAQ/CAMx Annual Simulations

T. W. Tesche & Dennis McNally -- Alpine Geophysics, LLC

Ralph Morris -- ENVIRON

Gail Tonnesen -- UC Riverside

Patricia Brewer -- VISTAS Technical Coordinator

James Boylan – Georgia Dept of Natural Resources

Models-3 CMAS Conference

18-20 October 2004

Chapel Hill, NC

Outline
  • VISTAS objectives
  • Model set-up for initial Phase II runs
  • Highlights of CMAQ/CAMx evaluations
    • Operational, Comparative, Diagnostic, Mechanistic
  • Some findings from diagnostic studies
  • Suggestions
VISTAS AQ Modeling Objectives
  • Phase I:
    • Evaluate suite of models for episodic and annual simulation of Regional Haze & PM2.5 on 36/12 km US grid
  • Phase II:
    • Select and evaluate preferred model(s) for 2002 annual period via detailed model performance and sensitivity evaluations
    • Evaluate emission control strategies for regional haze, particularly for VISTAS region.
    • Support VISTAS states responsible for upcoming PM2.5 attainment demonstrations.
Model Set-up for Initial 2002 Annual Run
  • 36/12 km grid, 19 layers
  • CMAQ v4.3 and CAMx v4.0
  • MM5 (Pleim-Xiu/ACM, 36/12 km)
  • 2002 Emissions for VISTAS states (WRAP and CENRAP updates; NEI 1999 V2 for rest of U.S.)
  • CMAQ (CB4, SORGAM); CAMx (CB4, SOAP)
  • BCs from 2001 Seasonal GEOS-CHEM
  • Models run in 4 quarters with 15-day spin-up (run windows sketched after this list)
  • VISTAS Phase II Modeling Protocol followed
  • For reports, results, and presentations: http://pah.cert.ucr.edu/vistas/vistas2/reports
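As a worked illustration of the run schedule above, the sketch below (a minimal, hypothetical Python snippet, not part of the VISTAS scripts) derives each 2002 quarterly window together with its 15-day spin-up period.

```python
# Hypothetical sketch: derive the four 2002 quarterly run windows, each
# preceded by a 15-day spin-up period whose output is later discarded.
from datetime import date, timedelta

SPINUP_DAYS = 15
quarters = [
    (date(2002, 1, 1), date(2002, 3, 31)),
    (date(2002, 4, 1), date(2002, 6, 30)),
    (date(2002, 7, 1), date(2002, 9, 30)),
    (date(2002, 10, 1), date(2002, 12, 31)),
]

for q, (start, end) in enumerate(quarters, start=1):
    spinup_start = start - timedelta(days=SPINUP_DAYS)
    print(f"Q{q}: spin-up begins {spinup_start}, scored period {start} to {end}")
```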

Operational Evaluation
  • Focus on
    • Visibility-related PM species
    • Identify needed improvements before final 2002 basecase simulations begin (next week…!)
  • Use suite of 15 metrics and graphical tools (fractional bias and error defined below)
  • Evaluate by month and monitoring network
  • Multiple evaluation teams
    • ENVIRON, UCR, Alpine, VISTAS-TAWG, GA-DNR
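The fractional bias and error statistics shown on the following slides are not defined in the deck; a standard pair of definitions for paired modeled (C^mod) and observed (C^obs) concentrations at N monitor-days is the mean fractional bias (MFB) and mean fractional error (MFE):

\[
\mathrm{MFB} = \frac{2}{N}\sum_{i=1}^{N}\frac{C_i^{\mathrm{mod}} - C_i^{\mathrm{obs}}}{C_i^{\mathrm{mod}} + C_i^{\mathrm{obs}}}\times 100\%,
\qquad
\mathrm{MFE} = \frac{2}{N}\sum_{i=1}^{N}\frac{\lvert C_i^{\mathrm{mod}} - C_i^{\mathrm{obs}}\rvert}{C_i^{\mathrm{mod}} + C_i^{\mathrm{obs}}}\times 100\%
\]

Both statistics are bounded (MFB by ±200%, MFE by 0-200%), which is why the sulfate and nitrate slides below note their plotted scales.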
Sulfate Fractional Bias and Error: CMAQ (note scale: 0-100%)

IMPROVE Data for VISTAS States: 12 km grid

Nitrate Fractional Bias and Error: CMAQ (note scale: 0-200%)

IMPROVE Data for VISTAS States: 12 km grid


Bias as a Function of Concentration: CMAQ

Data for VISTAS States: 12 km grid

Operational Evaluation Summary for CMAQ & CAMx
  • Good: SO4 and EC
  • Good-Fair: PM2.5 and PM10
  • Fair: NH4
  • Fair-Poor: OC and CM
  • Poor: NO3 and Soils
Comparative Evaluation
  • Inter-compare CMAQ v4.3 and CAMx v4.0 (a paired-comparison sketch follows this list)
  • Use identical SMOKE/MM5 inputs & VISTAS evaluation protocol
  • Examine reasons for similar and divergent behavior
    • Gas phase and aerosol species
    • Wet and dry deposition patterns
  • Conduct sensitivity experiments to elucidate similar and divergent behavior in CMAQ and CAMx
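One simple way to quantify "similar and divergent behavior" is a paired model-to-model difference statistic. The snippet below is a minimal sketch with hypothetical data; the normalized-mean-difference metric and the sample values are assumptions, not the VISTAS 15-metric suite.

```python
# Minimal sketch of a CMAQ-vs-CAMx paired comparison using hypothetical data.
import numpy as np

def normalized_mean_difference(model_a, model_b):
    """Percent difference of model_a relative to model_b, paired in space and time."""
    a = np.asarray(model_a, dtype=float)
    b = np.asarray(model_b, dtype=float)
    return 100.0 * (a - b).sum() / b.sum()

# Hypothetical monthly-mean SO4 concentrations (ug/m3) at the same monitors.
cmaq_so4 = np.array([3.1, 4.2, 2.8, 5.0])
camx_so4 = np.array([3.0, 4.4, 2.7, 4.8])

print(f"SO4 NMD, CMAQ relative to CAMx: {normalized_mean_difference(cmaq_so4, camx_so4):+.1f}%")
```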
Comparative Evaluation Summary

In general: CMAQ and CAMx respond consistently for most gas-phase and PM species

Winter: Large over-predictions of NO3 and CM

Summer: Large under-predictions of NO3 (but concentrations are quite small)

All Seasons: Soils over-predicted; OC under-predicted (understated primary OC emissions?)

Diagnostic Evaluation
  • Examine PM and gas-phase species by network
  • Evaluate effects of grid resolution, model response by sub-region, and range of time scales (stratification sketch after this list)
  • Examine differences in CMAQ/CAMx response
  • Synthesize CMAQ/CAMx model evaluation results to elucidate possible sources of model bias and error (e.g. formulation, inputs, …)
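The stratification idea is illustrated below with a minimal, hypothetical pandas sketch (column names and values are assumptions): computing fractional bias separately by month, network, and grid resolution keeps seasonal or sub-regional errors from cancelling in an annual average.

```python
# Hypothetical stratified-bias sketch; the DataFrame columns are assumed,
# not the actual VISTAS evaluation file format.
import pandas as pd

df = pd.DataFrame({
    "month":   [1, 1, 7, 7],
    "network": ["IMPROVE", "STN", "IMPROVE", "STN"],
    "grid_km": [12, 36, 12, 36],
    "mod":     [1.2, 0.9, 4.8, 5.5],   # modeled concentration, ug/m3
    "obs":     [1.0, 1.1, 5.0, 5.2],   # observed concentration, ug/m3
})

# Fractional bias per record (in %), then averaged within each stratum.
df["frac_bias_pct"] = 200.0 * (df["mod"] - df["obs"]) / (df["mod"] + df["obs"])
print(df.groupby(["month", "network", "grid_km"])["frac_bias_pct"].mean())
```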
Seasonal & Annual Average Aerosol Bias and Error: CMAQ

IMPROVE Data for VISTAS States: 12 km grid


Spatial Mean Nitrate: VISTAS vs. MANE-VU (CMAQ)

[Panels: VISTAS Jan ’02, MANE-VU Jan ’02, VISTAS May ’02, MANE-VU May ’02]


Spatial Mean Sulfate: VISTAS vs. MRPO (CMAQ)

[Panels: VISTAS Jan ’02, MRPO Jan ’02, VISTAS May ’02, MRPO May ’02]

Spatial Mean EC Dry Deposition

[Panels: CMAQ Jan ’02, CAMx Jan ’02, CMAQ Jul ’02, CAMx Jul ’02]

CMAQ deposition > CAMx deposition for EC

Spatial Mean CM Dry Deposition

[Panels: CMAQ Jan ’02, CAMx Jan ’02, CMAQ Jul ’02, CAMx Jul ’02]

CMAQ deposition << CAMx deposition for CM

Diagnostic Evaluation Summary

  • CMAQ and CAMx consistent for most species across all domains and time scales.
  • EC/CM bias ‘flip-flop’ due to different dry deposition algorithms in CMAQ/CAMx
  • OC bias differences in CMAQ/CAMx, in part, attributed to:
    • Different SOA chemistry formulations
    • Different environmental chamber data sets and parameterizations.
Mechanistic Evaluation: CB4 vs SAPRC99 for Jan ’02 & Jul ’01 Episodes
  • Very Similar Base Case Performance for SO4, NO3 and OC:
    • Differences between 36 and 12 km grid larger than differences between CB4 and SAPRC
    • SAPRC exhibits slightly improved performance for ozone compared to CB4
  • Generally Similar Response to 30% Controls, except:
    • SO4 sensitivity to NOx controls
      • SAPRC approximately twice as sensitive (sensitivity defined after this list)
      • Tied to H2O2 and O3 sensitivity to NOx controls
    • O3 sensitivity to VOC
      • SAPRC more sensitive than CB4
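For reference, the "twice as sensitive" comparison can be read in terms of a relative sensitivity to an emissions perturbation; the definition below is an assumed generic form, not one stated in the slides:

\[
S = \frac{\Delta C / C}{\Delta E / E}
\]

where \(\Delta C\) is the change in, for example, SO4 concentration produced by a fractional emissions change \(\Delta E / E\) (here, 30% NOx or VOC reductions). "SAPRC approximately twice as sensitive" then corresponds to \(S_{\mathrm{SAPRC}} \approx 2\,S_{\mathrm{CB4}}\) for the same perturbation.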
Three Suggestions
  • Devote greater emphasis to the diagnostic component of MPE (consider range of time and space scales, super-site data sets)
  • Utilize the extensive 2002 aircraft data base for aloft model evaluation (probe ‘regional transport’ issue)
  • Employ corroborative models to explore key uncertainties in
    • Input data base development
    • Base case model performance
    • Reliability of model response to emission controls