
PM Model Performance in Southern California Using UAMAERO-LT

Presentation Transcript


  1. PM Model Performance in Southern California Using UAMAERO-LT
     Joseph Cassmassi, Senior Meteorologist, SCAQMD
     February 11, 2004

  2. Particulate Modeling in the South Coast Air Basin: Historical Perspective
     • 1991 & 1994 AQMPs – Annual PM10 simulated using PIC for SO4 and NO3, with CMB and speciated rollback
     • 1997 AQMP – Annual PM10 simulated using UAMLC for SO4 and NO3, with CMB and speciated rollback
       > Simulated a PM10 episode using UAMAERO
     • 2003 AQMP – Annual PM10 and PM2.5 simulated using UAMAERO-LT
       > UAMAERO-LT developed by STI to incorporate CBIV gas-phase chemistry and an empirical PM partitioning model
       > PM partitioned into coarse and fine modes based on empirical data

  3. Establishment of Performance Criteria
     • No formal criteria recommended by EPA
     • A 30% error margin for the annual average was established in the 1997 PM10 modeling protocol: |predicted – observed| / observed ≤ 30% (a calculation sketch follows below)
     • Error calculated for each species [NH4, NO3, SO4, OC, EC, Other (Crustal)]
     • Error averaged by species over the PM sites simulated
     • Bias reviewed by species and by site
     • Complementary quarterly analysis
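A minimal sketch, in Python, of how the per-species error and bias statistics described above could be computed against the 30% criterion. The station name and concentration values are hypothetical placeholders, not the 1995 evaluation data, and the averaging of error by species over sites is implied by the slide rather than spelled out here.

```python
# Per-species error and bias check against the 30% annual-average criterion.
# All station names and values are hypothetical placeholders.

SPECIES = ["NH4", "NO3", "SO4", "OC", "EC", "Crustal"]

# Annual-average predictions and observations (ug/m3), keyed by (station, species).
predicted = {("SiteA", "NO3"): 17.2, ("SiteA", "SO4"): 4.1}
observed = {("SiteA", "NO3"): 15.0, ("SiteA", "SO4"): 5.2}

def percent_absolute_error(pred, obs):
    """|predicted - observed| / observed, expressed in percent."""
    return abs(pred - obs) / obs * 100.0

def bias(pred, obs):
    """Signed difference (ug/m3); positive means over-prediction."""
    return pred - obs

for key, pred in predicted.items():
    station, species = key
    obs = observed[key]
    err = percent_absolute_error(pred, obs)
    verdict = "within 30%" if err <= 30.0 else "exceeds 30%"
    print(f"{station} {species}: error {err:.1f}% ({verdict}), bias {bias(pred, obs):+.1f} ug/m3")
```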

  4. Performance Indicator Debate
     • Several indices and sub-regional analyses are used for gases (see the sketch below):
       > Peak predicted / observed
       > Bias
       > Error
     • Peak predicted / observed was used for ozone in the 2003 AQMP
     • Advisory group recommendations used RRF to assess different models/chemical mechanisms
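The slide names the indicator types but does not define them. The sketch below illustrates peak predicted/observed, mean bias, mean error, and a relative response factor (RRF); the RRF is taken here in the usual attainment-demonstration sense (future-case predicted divided by base-case predicted at each site), which is an assumption since the slide does not spell out the definition. All values are hypothetical.

```python
# Illustrative calculation of the indicator types listed above (hypothetical values).
import numpy as np

obs = np.array([22.0, 31.5, 27.0, 19.4])          # observed peaks at several sites (ug/m3)
pred_base = np.array([19.8, 33.0, 24.1, 21.0])    # base-case predictions
pred_future = np.array([14.2, 24.5, 18.0, 15.6])  # future-case predictions

peak_ratio = pred_base.max() / obs.max()                      # peak predicted / observed
mean_bias = np.mean(pred_base - obs)                          # ug/m3
mean_error = np.mean(np.abs(pred_base - obs) / obs) * 100.0   # percent

# Assumed RRF usage: ratio of future-case to base-case predictions per site,
# applied to the observed value rather than using absolute model output.
rrf = pred_future / pred_base
projected = obs * rrf

print(f"peak ratio {peak_ratio:.2f}, mean bias {mean_bias:+.1f} ug/m3, mean error {mean_error:.1f}%")
print("site RRFs:", np.round(rrf, 2), "| projected values:", np.round(projected, 1))
```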

  5. Game Plan for 2003 PM
     • Original concept: model the Basin for annual 1995 and evaluate output at 5 speciated sites
     • EPA requested extending the analysis beyond the 5 sites to enhance spatial resolution
     • Incorporate SSI Hi-Vol data in the analysis (evaluate simulation of PM10 mass)
     • Conduct grid-level analyses to evaluate emissions
     • Conduct temporal (daily) evaluations

  6. Time Considerations
     • Model performance indicators for particulates need to be comprehensive because of model simulation time requirements
     • Annual simulation using UAMAERO-LT, including set-up and post-processing:
       > Xeon Linux dual processor – 3 days
       > 5 vertical layers, 65 x 40 grid
     • Speciated rollback can be used as a quick confirmation analysis (see the sketch below)
     • Episodic simulations: variable – dependent upon the chemistry and dispersion platform
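The slide cites speciated rollback only as a quick confirmation tool; as a reminder of why it is fast, here is a minimal sketch assuming the simple linear form (each species scales with its emission change above background). The species list, background values, and emission ratios are hypothetical, not taken from the AQMP analysis.

```python
# Minimal speciated linear rollback sketch (assumed proportional form):
#   future = background + (base - background) * (future_emissions / base_emissions)
# All numbers are hypothetical placeholders.

base_conc = {"NO3": 15.0, "SO4": 5.0, "OC": 8.0, "Crustal": 10.0}          # ug/m3
background = {"NO3": 0.5, "SO4": 1.0, "OC": 1.5, "Crustal": 2.0}           # ug/m3
emission_ratio = {"NO3": 0.70, "SO4": 0.85, "OC": 0.80, "Crustal": 0.95}   # future / base

future_conc = {
    s: background[s] + (base_conc[s] - background[s]) * emission_ratio[s]
    for s in base_conc
}
future_pm = sum(future_conc.values())
print({s: round(v, 1) for s, v in future_conc.items()}, "total:", round(future_pm, 1), "ug/m3")
```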

  7. Questions Asked of the Annual Average PM10 Performance Evaluation
     • Concentration
       > Within 30% error?
       > Species proportions reasonable?
     • Are predictions at SSI sites reasonable (inferring that Basin emissions totals are in the ballpark)?
     • Does the spatial distribution match observations?
     • Are concentrations peaking during the correct season? (a quarterly check is sketched below)
     • Can emissions errors/anomalies be detected?
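As a companion to the seasonal-peaking question above (and the complementary quarterly analysis mentioned earlier), a minimal sketch of one way to check it: group daily predicted and observed values by calendar quarter and compare which quarter peaks. The daily series here is synthetic, generated only for illustration.

```python
# Quarterly-average check for seasonal peaking (synthetic daily data).
import numpy as np
import pandas as pd

dates = pd.date_range("1995-01-01", "1995-12-31", freq="D")
rng = np.random.default_rng(0)
observed = 30 + 15 * np.cos(2 * np.pi * (dates.dayofyear - 300) / 365) + rng.normal(0, 5, len(dates))
predicted = 0.9 * observed + rng.normal(0, 4, len(dates))

df = pd.DataFrame({"obs": observed, "pred": predicted}, index=dates)
quarterly = df.groupby(df.index.quarter).mean()   # mean concentration per calendar quarter
print(quarterly.round(1))
print("Observed peak quarter:", int(quarterly["obs"].idxmax()),
      "| Predicted peak quarter:", int(quarterly["pred"].idxmax()))
```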

  8. PM2.5 Performance Evaluation: Extending the PM10 Criteria to PM2.5
     • PM2.5-to-PM10 ratio for each species set by empirical analysis (applied in the sketch below):

       Species   PM2.5/PM10
       NH4       0.90
       NO3       0.74
       SO4       0.80
       OC        0.73
       EC        0.88
       Primary   Variable

     • Use the same criteria as for PM10: ±30% absolute error for individual PM2.5 species, averaged over five stations
     • Report bias tendency
     • Small concentrations exaggerate the statistics
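A minimal sketch of applying the empirical PM2.5/PM10 ratios above to estimate PM2.5 species from PM10 species concentrations. The PM10 input values are hypothetical placeholders; the primary (crustal) component is left out because the slide lists its ratio as variable.

```python
# Derive PM2.5 species estimates from PM10 species using the slide's ratios.
# pm10_species values are hypothetical, not the 1995 data.

PM25_TO_PM10 = {"NH4": 0.90, "NO3": 0.74, "SO4": 0.80, "OC": 0.73, "EC": 0.88}

pm10_species = {"NH4": 6.0, "NO3": 16.0, "SO4": 5.5, "OC": 9.0, "EC": 2.5}  # ug/m3, illustrative

pm25_species = {s: pm10_species[s] * r for s, r in PM25_TO_PM10.items()}
# The primary (crustal) fraction is handled separately because its ratio is variable.

for species, value in pm25_species.items():
    print(f"{species:4s} estimated PM2.5 = {value:5.2f} ug/m3")
```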

  9. Model-Simulated 1995 Annual PM2.5 (µg/m3); 1995 Measured Annual PM2.5 (µg/m3)

  10. Annual PM2.5 Component Bias (µg/m3) by Station; Annual PM2.5 Component Percent Absolute Error by Station

  11. Graphical Evaluation
     • Time series
       > Use the PM10 analysis for an estimate of PM2.5
       > At least 75% of PM10 is PM2.5 for each species
     • Evaluate SSI Hi-Vol NO3 & SO4
     • Bivariate plots (predicted vs. observed) (see the sketch below)
     • Spatial mapping
       > Grid-cell analysis above a threshold
       > Map particulate emissions
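A minimal sketch of the bivariate (predicted vs. observed) evaluation plot mentioned above, using matplotlib with a 1:1 line and a ±30% envelope consistent with the error criterion. The arrays are hypothetical annual-average values, not the 1995 model output.

```python
# Predicted-vs-observed scatter with 1:1 and +/-30% reference lines (hypothetical data).
import matplotlib.pyplot as plt
import numpy as np

observed = np.array([12.4, 18.9, 24.3, 33.1, 28.6])   # ug/m3, illustrative
predicted = np.array([10.8, 20.2, 22.5, 36.0, 25.9])  # ug/m3, illustrative

fig, ax = plt.subplots()
ax.scatter(observed, predicted)
lim = max(observed.max(), predicted.max()) * 1.1
ax.plot([0, lim], [0, lim], "k--", label="1:1")            # perfect agreement
ax.plot([0, lim], [0, 1.3 * lim], "k:", label="+/-30%")    # upper 30% bound
ax.plot([0, lim], [0, 0.7 * lim], "k:")                    # lower 30% bound
ax.set_xlabel("Observed annual PM2.5 (ug/m3)")
ax.set_ylabel("Predicted annual PM2.5 (ug/m3)")
ax.legend()
plt.show()
```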

  12. Rubidoux

  13. Rubidoux

  14. Simulated 1995 Annual Average PM10

  15. 1995 Annual Average PM10

  16. Simulated 1995 Annual Average PM2.5 (Same Scale as PM10)

  17. Simulated 1995 Annual Average PM2.5 (Half the PM10 Scale)

  18. 1995 PTEP Sites Annual Average PM2.5 Superimposed (site values: 31.8, 25.1, 27.7, 35.8, 26.3 µg/m3)

  19. Other Model Performance Indicators (1995 Rubidoux as an Example)
     • Annual average ozone
       > Predicted 1.1 pphm
       > Observed 2.9 pphm
     • 24-hr average ozone
       > Predicted 3.3 pphm
       > Observed 7.4 pphm
     • Nitric acid
       > Predicted 1.9 µg/m3
       > Observed 1.1 µg/m3

  20. Uncertainties Contributing to the Performance Evaluation
     • Primary emissions are grid-specific and contribute to several PM2.5 categories (EC, OC, SO4 & crustal)
     • Ammonia emissions are variable
     • The NOx impact on particulate formation is non-linear
     • Specification of boundary conditions (the 2003 AQMP used monthly values)

  21. Assessment of PM10/PM2.5 Modeling
     • Need advice on ranking the importance of the different performance measures for model acceptance
     • Need a better meteorology and dispersion platform
     • Evaluate and export the LT linear chemistry to other platforms
     • Evaluate full aerosol chemistry
