
Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

University of California, Riverside

October 27, 2003, CMAS Annual Meeting, RTP, NC


Modeling Team Participants

UC Riverside: Gail Tonnesen, Zion Wang, Chao-Jung Chien, Mohammad Omary, Bo Wang

Ralph Morris et al., ENVIRON Corporation

Zac Adelman et al., Carolina Environmental Program

Tom Tesche et al., Alpine Geophysics

Don Olerud, BAMS



Acknowledgments

Western Regional Air Partnership: John Vimont, Mary Uhl, Kevin Briggs, Tom Moore

VISTAS: Pat Brewer, Jim Boylan, Shiela Holman



Topics

Model Performance Evaluation

WRAP 1996 Model Performance Evaluation

VISTAS 2002 Sensitivity Results

CMAQ Benchmarks



WRAP Modeling

1996 Annual Modeling

36 km grid for the western US, 95 x 85 cells x 18 layers

MM5 by Olerud et al.



WRAP Emissions Updates

Corrections to point sources

MOBILE6 beta for WRAP states

Monthly corrections for NH3 based on EPA/ORD inverse modeling.

Updated non-road model

Typical fires used for results shown here

1996 NEI for non-WRAP states




WRAP CMAQ Revisions

  • v0301, released in March 2001

    • Used for the base case and all sensitivity cases in WRAP’s Section 309 simulations.

  • v0602, released in June 2002

  • v4.2.2, released in March 2003

  • v4.3, released in Sept. 2003



Comparisons based on IMPROVE evaluation



Model Performance Metrics

  • How well does the model reproduce the mean, modal, and variational characteristics of the observations?

    • Using observations to normalize model error and bias can lead to misleading conclusions:

      • if the observation is very small, the bias or error becomes very large

      • model under-prediction is bounded by -1 (-100%)

      • so over-prediction is weighted more heavily than under-prediction

  • We used mean normalized error and bias for the Section 309 work:

    • a poor metric for clean conditions (illustrated in the sketch below)
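The clean-condition problem is easy to see numerically. A minimal sketch in Python (mine, not from the slides; the concentration values are invented for illustration):

import numpy as np

def mean_normalized_bias(pred, obs):
    # MNB = (1/N) * sum((pred - obs) / obs); under-prediction can never fall below -1
    pred, obs = np.asarray(pred, dtype=float), np.asarray(obs, dtype=float)
    return np.mean((pred - obs) / obs)

# Hypothetical 24-hr sulfate (ug/m3): two very clean days and two moderately
# polluted days, all with small absolute errors.
obs  = np.array([0.05, 0.10, 2.0, 3.0])
pred = np.array([0.50, 0.40, 1.8, 2.7])

print(mean_normalized_bias(pred, obs))   # ~ +2.95, dominated by the two clean days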



Recommended Performance Metrics

  • Use fractional error and bias:

    • fractional bias is bounded by symmetric limits of ±2 (fractional error by 0 to +2)

  • Normalized mean error and bias:

    • divide the sum of the errors (or biases) by the sum of the observations

  • Coefficient of determination (R2):

    • indicates how much of the variability in the model predictions is explained by their relationship to the ambient observations, i.e. how closely the predictions track the observations (see the sketch below)
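A sketch of these metrics, assuming the standard formulas (my formulation; the slide text gives only the qualitative descriptions):

import numpy as np

def fractional_bias(pred, obs):
    # FB = (2/N) * sum((P - O) / (P + O)); bounded by -2 .. +2
    return 2.0 * np.mean((pred - obs) / (pred + obs))

def fractional_error(pred, obs):
    # FE = (2/N) * sum(|P - O| / (P + O)); bounded by 0 .. +2
    return 2.0 * np.mean(np.abs(pred - obs) / (pred + obs))

def normalized_mean_bias(pred, obs):
    # NMB = sum(P - O) / sum(O); sums first, so clean days are not over-weighted
    return np.sum(pred - obs) / np.sum(obs)

def normalized_mean_error(pred, obs):
    # NME = sum(|P - O|) / sum(O)
    return np.sum(np.abs(pred - obs)) / np.sum(obs)

def r_squared(pred, obs):
    # coefficient of determination via the Pearson correlation
    return np.corrcoef(pred, obs)[0, 1] ** 2

# Same hypothetical values as in the MNB sketch above:
obs  = np.array([0.05, 0.10, 2.0, 3.0])
pred = np.array([0.50, 0.40, 1.8, 2.7])
print(normalized_mean_bias(pred, obs))   # ~ +0.05 (+5%), versus an MNB of ~ +2.95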



Statistical Measures Used in Model Performance Evaluation

  • In addition…

    • Mean observation

    • Mean prediction

    • Standard deviation (SD) of observation

    • Standard deviation (SD) of prediction

    • Correlation variance



Expanded Model Evaluation Software to include…

  • Ambient data evaluation for air quality monitoring networks:

    • IMPROVE (24-Hour average PM)

    • CASTNet (Weekly average PM & Gas)

    • STN (24-Hour average PM)

    • AQS (Hourly Gas)

    • NADP (weekly total deposition)

    • SEARCH

  • 17 statistical measures in model performance evaluation

  • All performance metrics can be computed in an automated process, with the model and data selected by site and averaging period (a minimal sketch of the idea follows this list):

    · allsite_daily · allsite_monthly · allsite_yearly

    · onesite_daily · onesite_monthly · onesite_yearly
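A minimal sketch of how such grouped, automated evaluation might be wired up (hypothetical column and function names, not the UCR tool's actual interface; the metric functions are those sketched earlier):

import pandas as pd

def evaluate(paired, metrics, site=None, period="daily"):
    """paired: DataFrame of matched model/obs values with columns
    ['site', 'date', 'obs', 'pred']; returns one row of metrics per period."""
    if site is not None:                                   # "onesite_*" selections
        paired = paired[paired["site"] == site]
    freq = {"daily": "D", "monthly": "M", "yearly": "Y"}[period]
    grouped = paired.groupby(pd.Grouper(key="date", freq=freq))
    return grouped.apply(
        lambda g: pd.Series({name: fn(g["pred"].values, g["obs"].values)
                             for name, fn in metrics.items()}))

# e.g. "allsite_monthly" with the metrics defined earlier:
# evaluate(improve_pairs, {"FB": fractional_bias, "NMB": normalized_mean_bias},
#          period="monthly")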


Community Model Evaluation Tool?

Facilitate model evaluation.

Benefit from shared development of tool.

Share monitoring data.

UCR software available at website:

www.cert.ucr.edu/aqm



WRAP 1996 Evaluation, CMAQ v4.3

[Performance evaluation plots from the WRAP 1996 CMAQ v4.3 runs are not reproduced in this transcript.]


WRAP 1996 Cases in Progress

New fugitive dust emissions model

New NH3 emissions model

Actual Prescribed & Ag burning emissions

2002 annual simulations being developed.



VISTAS Model 12 km Domain

34-layer MM5 by Olerud

1999 NEI

CMAQ v3



VISTAS Sensitivity Cases

3 Episodes: Jan 2002, July 1999, July 2001

Sensitivity Cases

MM5 MRF and ETA-MY, PBL height, Kz_min, layer collapsing

CB4-2002

SAPRC99

CMAQ-AIM

GEOS-Chem for boundary conditions (BCs)

NH3 emissions



VISTAS Key Findings

NO3 over-predicted in winter, under-predicted in summer.

Thornton et al. N2O5 parameterization had a small benefit; July MNB improved from –50% to –45%

SO4 performance reasonably good

Problems with PBL height

Kz_min = 1 improved performance

Investigating PBL height corrections

Minor differences in 19 vs 34 layers



Benchmarks

Athlon MP 2000 (1.66 GHz)

Opteron 246 (2.0 GHz)

32-bit code

64-bit code

Compare 1, 4, and 8 CPUs

Ported CMAQ to 64-bit SuSE Linux

Updated pointers and memory allocation for 64-bit



Test Case for Benchmarks

VISTAS 12 km domain

168 x 177 x 19 layers

Benchmarks for CMAQ 4.3

One day simulation, CB4, MEBI

Single-CPU run times (hours:minutes):

Athlon 2 GHz: 14:10

Opteron 32-bit 2 GHz: 12:49

Opteron 64-bit 2 GHz: 10:57
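Back-of-the-envelope speedups from these timings (my arithmetic, not a slide):

times_min = {"Athlon":         14 * 60 + 10,   # 850 minutes
             "Opteron 32-bit": 12 * 60 + 49,   # 769 minutes
             "Opteron 64-bit": 10 * 60 + 57}   # 657 minutes
base = times_min["Athlon"]
for name, t in times_min.items():
    print(f"{name}: {base / t:.2f}x vs. Athlon")
# roughly 1.00x, 1.11x, and 1.29x; the 64-bit build alone is ~17% faster than 32-bit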



Optimal Cost Configuration

Small cluster (< 8 CPUs): use Athlons

Large cluster (> 16 CPUs): use Opterons?



Conclusions

Major Improvements in WRAP 1996 Model

WRAP 2002 annual modeling underway

VISTAS Sensitivity Studies

Still have problems with NO3

Need better NH3 inventory

Need more attention to PBL heights in MM5

Community model evaluation tool?


