
Refining Real-Time QC Criteria for Ozone Monitors

Duc Nguyen, Senior Meteorologist. Presented at: National Air Quality Conferences, San Diego, CA, March 8, 2011.





Presentation Transcript


  1. Refining Real-Time QC Criteria for Ozone Monitors. Duc Nguyen, Senior Meteorologist. Presented at: National Air Quality Conferences, San Diego, CA, March 8, 2011.

  2. Problem & Objective Problem: Several false air quality alerts were issued in the DC-MD-VA forecast regions in 2010 due to erroneous data. An evaluation of the QC checks indicated that the limits/criteria were not refined enough to catch the "bad" data. Solution: Refine the QC criteria to better handle the potentially bad data. Approach: The refined QC criteria will be derived from monitoring data for the most recent 5 years (2005-2009) and/or 10 years (2000-2009).

  3. Background on QC Checks • 4 basic QC checks: minimum, maximum, sticking check, and rate of change (ROC). • Maximum checks: • Absolute limit → automatic invalidation (QC-Code 9) • Conditional limit → manual evaluation (QC-Code 5) • Data that pass all QC checks, along with suspect data (not evaluated in time), are used for real-time mapping, reporting, and air quality alerts.
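As a concrete illustration, the four basic checks can be sketched in a few lines of Python. All thresholds below are illustrative placeholders (not the actual AIRNow or MDE criteria), and the sticking-check formulation (flagging long runs of repeated values above a floor) is an assumed common form, not necessarily the operational logic.

```python
# Sketch of the four basic real-time QC checks described above.
# Thresholds are illustrative only; real criteria are site-specific.
QC_VALID, QC_SUSPECT, QC_INVALID = 0, 5, 9  # AIRNow-style QC codes

def qc_check(prev_ppb, curr_ppb, run_length,
             abs_max=150.0, cond_max=120.0, abs_min=-1.0,
             max_roc=60.0, stick_floor=40.0, stick_runs=3):
    """Return a QC code for the current hourly ozone value (ppb)."""
    # Minimum / absolute maximum checks -> automatic invalidation (code 9)
    if curr_ppb < abs_min or curr_ppb > abs_max:
        return QC_INVALID
    # Conditional maximum -> flag for manual evaluation (code 5)
    if curr_ppb > cond_max:
        return QC_SUSPECT
    # Rate-of-change check against the previous hour
    if prev_ppb is not None and abs(curr_ppb - prev_ppb) > max_roc:
        return QC_SUSPECT
    # Sticking check: too many identical consecutive values above a floor
    if curr_ppb >= stick_floor and run_length >= stick_runs:
        return QC_SUSPECT
    return QC_VALID
```

In this sketch the checks run in priority order, so an absolute-limit violation always wins over a merely suspect condition.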

  4. Evaluating Current QC Criteria (sample QC criteria in database: Edgewood, Furley) • The potential erroneous-data "window" (i.e. the area between observed data and the maximum invalidation limit) is too large → auto-invalidation potentially catches little or no bad data. • The rate-of-change limit appears to be too low → potentially filters out "good" data. • Some incorrect data entry for QC criteria (e.g. a duplicate QC code 9 [invalid] instead of having both 5 [suspect] and 9). • New sites added in recent years have more refined QC criteria, but still not refined enough.

  5. Criteria for Refining QC Parameters • Invalid minimum: change to -1 ppb. • Invalid maximum: daily average of the hourly difference in ozone concentrations between 2000-2009 and 2005-2009 (if not available, or the difference is < 5 ppb, use the state-wide average). • Suspect threshold: five-year hourly maximum (2005-2009). • ROC: ten-year maximum hourly ROC (2000-2009). • Sticking check threshold: unchanged at 40 ppb.
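These derivation rules can be sketched as follows, using invented hourly series. The data layout (flat lists of valid hourly values per site) is an assumption, and the slide's invalid-maximum rule (averaged hourly differences between the two periods, 5 ppb floor, state-wide fallback) is paraphrased as a simple margin rather than reproduced exactly.

```python
# Sketch: deriving refined QC limits from historical hourly ozone (ppb).
# hourly_10yr (2000-2009) and hourly_5yr (2005-2009) are flat lists of
# valid hourly values for one site; statewide_invalid_max is the
# fallback invalid-maximum described on the slide.

def refine_criteria(hourly_10yr, hourly_5yr, statewide_invalid_max):
    invalid_min = -1.0  # slide: change invalid minimum to -1 ppb
    # Suspect threshold: five-year hourly maximum (2005-2009)
    suspect_max = max(hourly_5yr)
    # ROC limit: ten-year maximum hour-to-hour change (2000-2009)
    max_roc = max(abs(b - a) for a, b in zip(hourly_10yr, hourly_10yr[1:]))
    # Invalid maximum (simplified): margin between the 10-year and
    # 5-year maxima; fall back to the state-wide value when < 5 ppb
    diff = max(hourly_10yr) - suspect_max
    invalid_max = suspect_max + diff if diff >= 5 else statewide_invalid_max
    return {"invalid_min": invalid_min, "suspect_max": suspect_max,
            "max_roc": max_roc, "invalid_max": invalid_max}
```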

  6. QC Criteria Dashboard A 1-page Excel dashboard interactively provides a quick visual display of old vs. new QC criteria based on site selection. The tool can be shared with other agencies, and the technique can be adopted for other pollutants.

  7. Dashboard Helps Pin-Point Outliers • Data used for refining the QC criteria are extracted from the AQS database [which is supposed to be FINAL] → a fundamental assumption. • The dashboard helps pin-point outlier data in several Mid-Atlantic states (DC, DE, MD, VA & WV). • These data NEED to be evaluated, and removed if necessary, for the refined QC criteria to work in real time. Automatic checks are only as good as the defined criteria!

  8. PG Equestrian Center (1 of 2) Highly Suspect

  9. PG Equestrian Center (2 of 2) • The suspect ROC is caused by the highlighted data. • The spike occurred during a power failure affecting other instruments → it must have affected the ozone instrument too. • Data to be invalidated.

  10. Rockville (1 of 2) Flat Max: HIGHLY Suspect

  11. Rockville (2 of 2) • The "flat" max is caused by the highlighted data. • The bad data caused 2 exceedances of the 75 ppb standard. • Turned out to be an oversight.

  12. May 30, 2007 (1 of 3) Hourly Max & ROC: HIGHLY Suspect

  13. May 30, 2007 (2 of 3) • Outlier data at HU-Beltsville and Hagerstown are highly suspect. • Data at Shenandoah NP seem to suggest possible mixing?

  14. May 30, 2007 (3 of 3) Reading just before power failure → to be invalidated. Power failure & calibration → to be invalidated.

  15. Fairhill (1 of 4) Highly Suspect … caused several dates

  16. Fairhill (2 of 4) • High readings at Fairhill were caused by auto-calibration during April 6-13, 2007 → to be invalidated. • Data at 2 other sites also had extreme outliers. • The spike at Essex was likely caused by PC checks (confirmed by Field Operations staff). • The spike at P.G. Equestrian occurred during a power failure affecting other instruments → it must have affected the ozone instrument too (slide #9). • Both to be invalidated.

  17. Fairhill (3 of 4) Calibration → to be invalidated. NJ-PA | DC-MD-VA • Highly suspect ozone profile at Fairhill, outside the norm for all sites in the Mid-Atlantic. • Transported ozone? • A unique localized event? • "Typical weather" couldn't have done it alone. • Based on my experience, no event like this has ever occurred. • Another value at Padonia (calibration → invalid).

  18. Fairhill (4 of 4) • A hand-written note reveals an important clue about the instrument's malfunction. • The precision check failed on August 4, 2003. • Data to be invalidated back to the last "OK" precision check (July 30, 2003).

  19. September 7, 2010 (1 of 2) Spotted by ROC & Dashboard. The 19 ppb reading at 11 EST at Rockville appears suspect, but CAN'T be invalidated for that reason alone!

  20. September 7, 2010 (2 of 2) 5-Minute Data | 1-Hour Data. Data status by DR DAS: "<S" for under 75%, "Do" for down. • No valid 5-minute samples for the 11 AM block. • Further investigation by DR DAS indicated that data were collected erroneously just prior to a power failure (i.e. a software issue). • Envitech will look into it to prevent it from happening again.

  21. Ozone Scavenging: Tacoma, McMillian, River Terrace, Suitland. These monitors tend to be located in the "urban core." !!! Good Data !!!

  22. Examples of Possible Ozone Scavenging !!! Good Data !!!

  23. Maryland's Data Evaluation and Disposition ROC Criteria: 1990s: -60 < ROC < 60; 2000s: -50 < ROC < 50. Tableau Data: 1990-2010.

  24. Outlier Data Unique to Maryland? • NO!!! • 6 other Mid-Atlantic states investigated (DC, DE, NJ, PA, VA & WV) have similar issues. • 2 states outside of the Mid-Atlantic (CA & NY) have similar issues. • "Outlier" ozone data are likely more widespread in AQS across most states. What about other pollutant data in AQS?

  25. DC, DE, VA & WV Data ROC Criteria (excludes 510690010): -55 < ROC < 55

  26. NJ & PA Data ROC Criteria: -60 < ROC < 60

  27. New York Data ROC Criteria: -60 < ROC < 60

  28. California Data ROC Criteria: -70 < ROC < 70

  29. Tableau Public (limited to 100k records) could be used to visualize outliers! DC, DE, VA & WV Data: 2000-2010 | NJ-PA Data: 2000-2010 | NY Data: 2000-2010 | CA Data: 2005-2008 | 2009 | 2010

  30. Examine Data in AIRNowTech • Data management/QC staff noticed that "GOOD" data in AIRNowTech are sporadically invalidated. • The invalidated data sometimes change the daily maximum 8-hour ozone concentrations significantly. • A thorough examination indicated that the invalidations were caused by the ROC limit being too low (i.e. 20 ppb for ozone monitors in Maryland).

  31. Case 1: May 27, 2010 Flagged values: 84 ppb (ROC > 20) · 47 ppb (ROC > 20) · 78 & 58 ppb (ROC ≥ 20) · Instrument error: 45 & 59 ppb · 45, 43 & 44 ppb (ROC > 20) · 52 ppb (ROC > 20) · 49 ppb (ROC > 20). The data wouldn't have been removed if the recommended QC criteria were implemented!

  32. Case 2: May 21, 2010 • Data highlighted in yellow were invalidated automatically by the ROC check. • The data wouldn't have been removed if the recommended QC criteria were implemented! • The invalidated data caused the daily peak 8-hour ozone at Rockville to exceed the 2008 standard; it should be 73 ppb.
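The effect of the over-tight 20 ppb ROC limit in these two cases can be reproduced with a toy series. The hourly values below are invented, not the actual Rockville data, and 50 ppb stands in for an illustrative refined limit.

```python
# Toy demo: a legitimate midday ozone ramp-up is auto-invalidated by the
# old 20 ppb ROC limit but survives a more refined limit (50 ppb here,
# an illustrative value only).

def flag_by_roc(hourly_ppb, roc_limit):
    """Return indices of hours whose change from the prior hour exceeds the limit."""
    return [i for i in range(1, len(hourly_ppb))
            if abs(hourly_ppb[i] - hourly_ppb[i - 1]) > roc_limit]

ramp = [35, 48, 72, 84, 80]  # invented afternoon ramp: +13, +24, +12, -4
assert flag_by_roc(ramp, 20) == [2]  # old limit wrongly flags the +24 rise
assert flag_by_roc(ramp, 50) == []   # refined limit keeps the good data
```

Dropping the flagged hour from an 8-hour average is exactly how a false invalidation can shift the reported daily peak, as in the Rockville case above.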

  33. !!! CAUTION !!! • Be extremely careful with the data evaluation process, and only remove data that are "OBVIOUSLY" bad or couldn't be right! • Things to consider: • High values during known auto-calibration hour(s) • Leading zeros • The first observation after an extended missing-data period • Human error • Winter-time data • ROC is very valuable but shouldn't be the only method • Complex terrain • High-elevation monitors as surrogates for night-time regional concentrations • Unique meteorological conditions (thunderstorms, bay/land breezes, subsidence, etc.) • Have fun, because you will learn something new! • Others?

  34. Summary • Current QC criteria in AIRNow are not refined: • Extreme outliers (almost certainly bad) are missed. • Low ROC limits invalidate "good" data. • The proposed QC criteria will greatly reduce these problems. • A 1-page dashboard quickly displays proposed vs. old criteria: • Easily adopted by other agencies. • Can be adapted for other pollutant data. • This work helps pin-point extreme outlier data, present in the AQS database for various reasons.

  35. Recommended Actions • QC criteria need to be re-examined and updated by EPA or the agency responsible for the data. • Re-instate the ability for agencies to update QC criteria via AIRNowTech (on hold due to funding). • Data in AQS should be re-examined: • Bulk evaluation of pollutant data in AQS to identify outliers (EPA?). • Agencies responsible for the data evaluate the outliers identified by EPA and update if necessary. • EPA should survey data management/QC staff at local/state/tribal air agencies to understand why extreme outlier data (most of which should have led to automatic invalidation?) were not invalidated, and take corrective actions to keep this from happening in the future: • Is it related to software generating data for submittal to AQS? • Is it related to not having interactive and visual analysis tools to easily evaluate data? • Is it related to staff needing training?

  36. Acknowledgements • Jessica Johnson (AIRNow DMC) • Ryan Auvil, Edwin Gluth, Jennifer Hains, John Haus, Najma Khokhar, David Krask, Laura Landry, Ewa Naylor, and Michael Woodman (MDE) • John E. White & Scott Jackson (EPA) • Robert D. Day (DC DOE) • Carolyn Stevens (VA DEQ)

  37. Contact Duc Nguyen, Senior Meteorologist, Ambient Air Monitoring Program, Air and Radiation Management Administration. dnguyen@mde.state.md.us, 410-537-3275. Maryland Department of the Environment, 1800 Washington Boulevard | Baltimore, MD 21230. 410-537-3000 | TTY Users: 1-800-735-2258. www.mde.state.md.us. Martin O'Malley, Governor | Anthony G. Brown, Lt. Governor | Robert M. Summers, Ph.D., Acting Secretary

  38. Additional Slides • Analyses for selected case studies.

  39. September 18, 1991 Notes: The sudden drop in ozone in Southern Maryland was not unique; it was likely caused by clouds inhibiting ozone formation.

  40. July 6, 1994 Notes: (a) The zero ppb reading at 10 EST was likely not real. (b) The sudden drop at Suitland was possibly due to its proximity to "fresh" mobile emissions from DC traffic (slide #21). Small movements in this urban emission plume could lead to ozone scavenging.

  41. Frederick Airport (1 of 2) Somewhat Suspect

  42. Frederick Airport (2 of 2) 18 EST | 19 EST • The data appear suspicious, but there were no flags in the raw data to indicate that they were invalid. • Satellite & radar imagery indicated widespread thunderstorms that evening. • Surprisingly, ozone didn't drop that quickly at any other sites. !!! Valid Data !!!

  43. August 3, 2002 Local "Washout" Overnight | Previous Day Ozone (Aug 2, 2002) | Late Afternoon Thunderstorms. Thunderstorms didn't mean much back in the day, as long as upwind concentrations were high the prior day and there was some sunshine for ozone formation.

  44. August 14, 2002 Effect of Overnight Showers

  45. Subsidence (September 11, 2002) 1 EST | 1 EST | 4 EST. Washout for monitors influenced by heavy rain. TS Gustav. Sinking - Blue | Rising - Red.

  46. June 23, 2005 Sinking - Blue | Rising - Red. Early onset of ozone for sites near the frontal boundary due to the solenoidal effect. No reason to invalidate. 1 EST. Frontal passage & light rain overnight.

  47. August 4, 2005 15:00 EST | 16:00 EST. An outflow boundary caused huge spikes at Fairhill, MD & Lums 2, DE.
