
Quality Assurance of Upper Air Meteorological Data






Presentation Transcript


  1. Quality Assurance of Upper Air Meteorological Data Robert A. Baxter, CCM, Parsons Engineering Science, Pasadena, CA

  2. Upper Air QA Overview • Field program QA - Scope of QA during the data collection effort - Implementation of the field QA program - Overall results of the field QA effort • Data validation QA - Unresolved issues or unprocessed data - Post-processing algorithm analysis associated with data validation - Lessons learned from the overall program

  3. Field Program Quality Assurance • Review candidate monitoring sites and aid in the site selection process • Perform system and performance audits early in the program to aid in early identification and correction of any potential problems • Assess the accuracy of the data collected

  4. Field Program Quality Assurance: Overall Scope • Candidate site reviews (16 sites) • Systems audits (26 stations) • Performance audits - 25 surface meteorological stations - 4 sodars - 10 radar profilers/RASS systems • Assess overall data quality from surface and upper air measurements

  5. Field Program Quality Assurance: Equipment and Variables Audited • Radar wind profilers with RASS - NOAA/ETL 915 MHz three-axis - Radian 915 MHz phased array • Sodars - NOAA/ETL two-axis - Radian phased array - AeroVironment three-axis • Surface meteorology (WS, WD, T, RH)

  6. Site Locations

  7. Overall Field Audit Schedule

  8. Audit Activities • Candidate site reviews • System audits • Performance audits

  9. Audit Activities: Candidate Site Reviews • Exposure for measurements • Noise sources - RF analysis - AF analysis • Power, security and communications • Compatibility with neighbors • Suitability for measurements • Suitability for audit instrumentation • Assessment of appropriate beam directions

  10. Audit Activities: System Audits • System audit checklist - Observables, equipment, exposure, operations - Procedures, training, data chain of custody - Preventive maintenance • Site vista evaluation - Orientation, level - Picture documentation • Operating environment - Background noise - Potential interfering sources

  11. Sample Picture Documentation

  12. Audit Activities: Performance Audits (Surface) • Wind speed - Response - Starting threshold • Wind direction - Alignment to true north - Response - Starting threshold • Temperature • Relative humidity
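
The true-north alignment and starting-threshold checks above lend themselves to simple corrections during validation. The following is a minimal sketch, assuming a constant vane offset determined during the audit and an illustrative 0.5 m/s starting threshold; the function names and numbers are hypothetical, not values from the study.

```python
# Hypothetical sketch: apply an audit-derived vane alignment offset and flag
# wind speeds below the sensor's starting threshold. The 0.5 m/s threshold
# and all names are illustrative assumptions.

def correct_wind_direction(measured_deg, vane_offset_deg):
    """Rotate a measured wind direction by the offset found during the
    true-north alignment check (positive offset = vane aligned clockwise
    of true north)."""
    return (measured_deg - vane_offset_deg) % 360.0

def flag_below_threshold(speeds_ms, threshold_ms=0.5):
    """Mark wind speeds below the anemometer starting threshold, where
    reported speeds and directions are unreliable."""
    return [s < threshold_ms for s in speeds_ms]

# Example: a vane found 8 degrees clockwise of true north during the audit
print(correct_wind_direction(275.0, 8.0))     # -> 267.0
print(flag_below_threshold([0.3, 1.2, 4.7]))  # -> [True, False, False]
```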

  13. Audit Activities: Performance Audits (Upper Air) • Radar wind profiler (10 sites + ARB) - compared against portable sodar and rawinsonde • RASS (10 sites + ARB) - compared against rawinsonde • Sodar (4 sites) - compared against simulated winds using APT

  14. Field QA Data Analysis and Reporting • In-field evaluation - preliminary surface and upper air results - system audit review - same-day re-audits for any correctable deficiencies • Audit summary findings - provided by e-mail to management in ~48 hours - overall results with needed actions identified • Completed audit summaries - provided by e-mail to management in ~2 weeks - detailed system and performance audit reports • Audit follow-up

  15. Field QA Overall Results • Site operational differences between contractors • Systematic problems with equipment alignment • Equipment orientation errors in the data • Differences in data QC and validation procedures between reporting groups

  16. Data Validation QA • Why are we going through these steps? • What is the role of QA in the validation phase?

  17. Data Validation QA: Why Are We Going Through These Steps? • Variety of data formats and reporting conventions • Questions about the post-processing algorithms • Incorporation of additional available data • Completion of the processing steps

  18. Data Validation QA: What Is the Role of QA in the Validation Phase? • Identification of system offsets • Evaluation of post-processing algorithms • Sodar data evaluation and validation • Data quality descriptors

  19. Data Validation QA • Identification of system offsets • Evaluation of post-processing algorithms • Sodar data evaluation and validation • Data quality descriptors

  20. Data Validation QA: Identification of System Offsets • Audit report and data header information - Antenna orientation - Surface vane orientation - Time zone differences - Reporting interval differences
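
Once an offset has been identified from the audit reports and data headers, the fix is typically a constant correction applied to the affected records. Below is a minimal sketch, assuming a fixed antenna (or vane) orientation error and a simple time-zone shift; the function names, offsets, and time bases are illustrative assumptions rather than values from the study.

```python
# Hypothetical sketch of two common offset corrections found during
# validation: a constant orientation error removed from reported wind
# directions, and a time-zone shift applied to record timestamps.
from datetime import datetime, timedelta

def apply_orientation_offset(direction_deg, offset_deg):
    """Remove a constant orientation error (e.g., an antenna or vane
    aligned 12 deg east of true north) from a reported wind direction."""
    return (direction_deg - offset_deg) % 360.0

def to_standard_time(timestamp, reported_utc_offset, target_utc_offset):
    """Shift a record timestamp from the logger's time base to the
    project's standard time base (e.g., local daylight time to PST)."""
    return timestamp + timedelta(hours=target_utc_offset - reported_utc_offset)

t = datetime(2000, 7, 15, 14, 0)             # logged in PDT (UTC-7), illustrative
print(to_standard_time(t, -7, -8))           # -> 2000-07-15 13:00 (PST)
print(apply_orientation_offset(10.0, 12.0))  # -> 358.0
```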

  21. Data Validation QA: Evaluation of Post-Processing Algorithms • Goal is to determine the most appropriate methods to process and validate data • Regional site classification - Coastal and offshore - Inland - Desert • Data set comparisons - _0, _1, CNS, Sonde • Comparison statistics - RMS, systematic difference
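
The comparison statistics named above can be computed directly from paired values of the two data sets being compared (for example, profiler winds against collocated rawinsonde winds at matched heights). A minimal sketch follows; the data values are made up for illustration.

```python
# Hypothetical sketch of the two comparison statistics on the slide:
# the systematic (mean) difference and the RMS difference between paired
# values from two data sets. All values below are illustrative.
import math

def systematic_diff(test, reference):
    """Mean difference (bias) of test minus reference."""
    diffs = [t - r for t, r in zip(test, reference)]
    return sum(diffs) / len(diffs)

def rms_diff(test, reference):
    """Root-mean-square difference of test minus reference."""
    diffs = [t - r for t, r in zip(test, reference)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

profiler_ws   = [3.1, 4.8, 6.2, 7.0]  # m/s, illustrative
rawinsonde_ws = [2.9, 5.1, 6.0, 7.4]  # m/s, illustrative
print(systematic_diff(profiler_ws, rawinsonde_ws))  # bias
print(rms_diff(profiler_ws, rawinsonde_ws))         # scatter plus bias
```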

  22. Data Validation QA: Sodar Data Evaluation and Validation • Review all sodar data (six sites) • Determine needed post-processing - Vertical velocity correction - Antenna rotations - Algorithm corrections - Interference problems (noise, reflections)
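
Two of the post-processing steps listed above, the vertical velocity correction and the antenna rotation, are straightforward geometric operations. The sketch below assumes a three-axis sodar with tilted beams at a nominal zenith angle; the beam geometry, sign conventions, and numbers are illustrative assumptions, not the study's actual processing code.

```python
# Hypothetical sketch: remove the vertical-velocity contribution from a
# tilted-beam radial velocity, then rotate u/v to correct an antenna
# azimuth misalignment. Zenith angle and all values are illustrative.
import math

def horizontal_component(radial_ms, w_ms, zenith_deg):
    """Recover the horizontal wind component along a tilted beam:
    Vr = Vh*sin(zenith) + w*cos(zenith)  =>  Vh = (Vr - w*cos) / sin."""
    z = math.radians(zenith_deg)
    return (radial_ms - w_ms * math.cos(z)) / math.sin(z)

def rotate_uv(u, v, azimuth_error_deg):
    """Rotate u/v by the antenna azimuth error; the sign convention
    depends on how the misalignment is defined."""
    a = math.radians(azimuth_error_deg)
    return (u * math.cos(a) - v * math.sin(a),
            u * math.sin(a) + v * math.cos(a))

u = horizontal_component(radial_ms=1.8, w_ms=0.2, zenith_deg=18.0)
v = horizontal_component(radial_ms=-0.9, w_ms=0.2, zenith_deg=18.0)
print(rotate_uv(u, v, azimuth_error_deg=-5.0))
```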

  23. Data Validation QA: Data Quality Descriptors • Metadata • Site-by-site descriptors • Data qualifiers (minor offsets, limitations) • Pertinent information from audits and validation
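
One way to carry these descriptors alongside the data is a simple per-site metadata record. The sketch below is hypothetical; the field names and example entries are illustrative, not the study's actual descriptor format.

```python
# Hypothetical per-site metadata record holding data quality descriptors:
# site identifiers, qualifiers for minor offsets or limitations, and
# pertinent notes carried forward from the audits and validation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SiteQualityRecord:
    site_id: str
    instrument: str                                        # e.g., "915 MHz radar profiler"
    qualifiers: List[str] = field(default_factory=list)    # minor offsets, limitations
    audit_notes: List[str] = field(default_factory=list)   # findings from audits/validation

record = SiteQualityRecord(
    site_id="EXAMPLE01",                                   # illustrative site name
    instrument="915 MHz radar profiler with RASS",
    qualifiers=["wind directions corrected for antenna orientation offset"],
    audit_notes=["timestamps converted from local daylight time to PST"],
)
print(record)
```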

  24. Lessons Learned From the QA Program • On-site review of each and every station • Consistent procedures implemented by each audit group • Implementation of and adherence to SOPs by all study organizations • Consistent processing procedures implemented by groups with similar data sets • Don't shortcut the on-site documentation process in either operations or QA
