
QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS

QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS. WMO TECHNICAL CONFERENCE ON METEOROLOGICAL AND ENVIRONMENTAL INSTRUMENTS AND METHODS OF OBSERVATION TECO-2005 C. Bruce Baker, NOAA



  1. QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS WMO TECHNICAL CONFERENCE ON METEOROLOGICAL AND ENVIRONMENTAL INSTRUMENTS AND METHODS OF OBSERVATION TECO-2005 C. Bruce Baker, NOAA USA

  2. The Backbone QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS Note the stability

  3. Functions of an International/National Backbone • Infrastructure in Place for Quality measurements • Collects open access data and provides consistent quality assurance and control • Distributes data and information (via multiple paths) in real time (varies with parameter) and ensures archival • Abides by national / international standards and fosters the implementation of standards by local and regional observing systems

  4. Key Components • Management of Network Change • Parallel Testing • Meta Data • Data Quality and Continuity • Integrated Environmental Assessment • Complementary Data • Continuity of Purpose • Data and Meta Data Access

  5. VOCABULARY • MANAGEMENT: Documentation, Performance Measures, and Requirements • PROGRAM POLICY: Determined by International or National Policy and Science-Driven Directives • QUALITY MANAGEMENT SYSTEM: Personnel, Hardware, Ingest, and Dissemination • QUALITY MANUAL: Requirements Documents • QUALITY CONTROL: Automated, Manual, Maintenance • QUALITY ASSURANCE: Documented Metadata, Performance Measures • RESEARCH: Testing, Intercomparisons, Transfer Functions, Overlapping Measurements • IMPLEMENTATION: Program Infrastructure

  6. Functional Requirements • Systems - parameters, ranges, accuracies, resolutions, expandability, design life, maintainability • Program - number of systems, cost and schedule targets, communications • Commissioning - defines the decision point at which data become official • Sustained operation - data received from each site 95% of the time within one hour and/or successfully entered into the archives within 30 days

  7. Configuration Management • Change management of hardware and software items, metadata management • Responsibilities and procedures for the Configuration Control Board (CCB)

  8. Test and Evaluation Phase • Conducted by Evaluation Team • Reviewed by Ad Hoc Science Working Group • Six areas Evaluated • Site Selection • Site Installation • Field Equipment and Sensors • Communications • Data Processing and Quality Control • Maintenance

  9. Components of Data Quality Assurance (QA) • Laboratory Calibration • Routine Maintenance and In-Field Comparisons • Automated Quality Assurance • Manual Quality Assurance • Metadata, Metadata, Metadata • Ability to Integrate New Technology

  10. Laboratory Calibration • Every sensor is calibrated before deployment to verify and improve upon the manufacturer’s specifications • Sensors are routinely rotated back into the lab from the field to be re-calibrated
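The pre-deployment verification described above can be sketched as a comparison of sensor readings against a laboratory reference. A minimal illustration: the setpoints, readings, spec limit, and linear correction below are assumptions for the sketch, not USCRN procedures or values.

```python
# Hypothetical sketch: verify a temperature sensor against a laboratory
# reference before deployment. Readings and the 0.3 C spec limit are
# illustrative, not actual USCRN calibration values.

def fit_linear_correction(reference, measured):
    """Least-squares fit measured = a + b * reference; returns (a, b)."""
    n = len(reference)
    mean_r = sum(reference) / n
    mean_m = sum(measured) / n
    cov = sum((r - mean_r) * (m - mean_m) for r, m in zip(reference, measured))
    var = sum((r - mean_r) ** 2 for r in reference)
    b = cov / var
    a = mean_m - b * mean_r
    return a, b

def passes_spec(reference, measured, spec_c=0.3):
    """True if every raw error |measured - reference| is within spec_c (deg C)."""
    return all(abs(m - r) <= spec_c for r, m in zip(reference, measured))

# Reference-bath setpoints and (hypothetical) sensor readings, deg C
ref = [-30.0, -10.0, 0.0, 10.0, 30.0, 50.0]
obs = [-29.85, -9.92, 0.05, 10.08, 30.12, 50.18]

a, b = fit_linear_correction(ref, obs)
print(f"offset={a:+.3f} C, gain={b:.4f}, within spec: {passes_spec(ref, obs)}")
```

Fitting the offset and gain against the reference, rather than only pass/fail checking, is what allows a lab to "improve upon the manufacturer's specifications" by applying the correction in post-processing.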

  11. Routine Maintenance and In-Field Comparisons • Site Maintenance Passes • Three visits scheduled annually • Trouble Ticket or Emergency Repairs • Malfunctioning Sensor • Lightning Strike • Communication Problems • Theft and Vandalism

  12. Site Maintenance Passes Sensor Inspection Air Temperature and Humidity sensors are inspected for dust accumulation, spider webbing and wasp nests. The radiation shields of these sensors are also cleaned.

  13. Trouble Ticket or Emergency Repairs Trouble Tickets • Issued by the Data QA Manager • Priorities range from 2 to 30 business days (based on sensor) • QA Manager provides a description of the problem • Technicians complete the form with time of fix, serial numbers of sensors and a description of the repairs made • Technicians may also generate tickets in the field and submit them to the QA Manager

  14. Quality Assurance of Instruments • Documented in Anomaly Tracking System Users Manual • Reports of incidents collected and evaluated; maintenance performed as needed • Metadata records updated Quality Control of Data • Documented in Data Management – Ingest to Access • Data ingest: tests for proper message form, communication errors, etc. • Automated: Limits (gross limits check), Variance (limits for individual parameters), Redundancy (data inter-comparison relying on multiple sensors) • Manual: Handbook of Manual Monitoring
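The three automated checks named above (gross limits, variance, redundancy) can be sketched as follows. All thresholds are illustrative placeholders, not operational USCRN limits:

```python
# Hedged sketch of the three automated QC checks: gross limits,
# variance-style step limits, and redundancy among co-located sensors.
# Thresholds are placeholders, not operational USCRN values.

def gross_limit_check(value, lo=-60.0, hi=60.0):
    """Gross limits: flag any air temperature (deg C) outside a physical range."""
    return lo <= value <= hi

def variance_check(values, max_step=10.0):
    """Variance-style check: flag a series containing an implausible jump
    between consecutive readings."""
    return all(abs(b - a) <= max_step for a, b in zip(values, values[1:]))

def redundancy_check(readings, tolerance=0.5):
    """Redundancy: compare co-located sensors; flag any reading that
    disagrees with the median of the group by more than the tolerance."""
    med = sorted(readings)[len(readings) // 2]
    return [abs(r - med) <= tolerance for r in readings]

print(gross_limit_check(23.4))                   # True: within physical limits
print(variance_check([20.1, 20.3, 20.2, 31.0]))  # False: 10.8 C jump
print(redundancy_check([20.1, 20.2, 23.9]))      # [True, True, False]
```

The redundancy check is only possible because of the planned sensor redundancy mentioned later in the deck: with three co-located sensors, a single drifting instrument can be identified against the other two.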

  15. Metadata Management: Survey to Operations

  16. [Data-flow diagram] Field Sites (Instrument Suite, Processing Unit, Communications Device) • Communications Network • Ingest • Processing • Quality Control • Raw-Data Archive • Flagged-Data Archive • Access (Internet) • User Community • Maintenance Notification (offline/online) • Maintenance Provider

  17. Performance Measures 114 CONUS Geographic Locations Required • Captures 98% of variance in monthly temperature, 95% in annual precipitation for CONUS • Average annual error <0.1°C for temperature, <1.5% for precipitation • Trend "errors" <0.05°C per decade • IPCC projects warming of 0.1-0.3°C/decade and precipitation changes of 0-2%/decade for CONUS
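The trend-error target above presupposes estimating a decadal trend from a station series. A minimal sketch using ordinary least squares on annual anomalies; the series below is synthetic, constructed purely to exercise the arithmetic:

```python
# Illustrative sketch: estimate a decadal temperature trend from annual
# anomalies by ordinary least squares. The anomaly series is synthetic.

def trend_per_decade(years, anomalies):
    """OLS slope of anomalies (deg C) vs. years, scaled to deg C/decade."""
    n = len(years)
    my = sum(years) / n
    ma = sum(anomalies) / n
    slope = (sum((y - my) * (a - ma) for y, a in zip(years, anomalies))
             / sum((y - my) ** 2 for y in years))
    return slope * 10.0

years = list(range(1995, 2005))
# Synthetic anomalies with a built-in 0.02 C/yr (0.2 C/decade) trend
anoms = [0.02 * (y - 1995) for y in years]
print(f"{trend_per_decade(years, anoms):.2f} C/decade")
```

Against the IPCC-projected 0.1-0.3°C/decade warming quoted on the slide, a network trend error below 0.05°C/decade keeps the measurement uncertainty well under the signal being monitored.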

  18. Determine the Actual Long-term Changes in Temperature and Precipitation of the Contiguous U.S. (CONUS) FY2005 Target: Capture more than 96.9% of temperature trends and 91.1% of precipitation trends.

  19. RESEARCH

  20. Tretyakov Shield with Ott

  21. Double Alter with Geonor

  22. Air Temperature & RH Monitoring at High Plains Regional Climate Center (Lincoln) [photo] Sensor labels: DewTrack, MET2010 Standard, RMY, USCRN Shield PMT, New ASOS Standard HMP243, ASOS, MMTS, CRS, Gill

  23. Network Integration

  24. Cross-Network Transfer Functions Cooperative Observer Network (~10,000 Stations)
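A cross-network transfer function of the kind named above can be illustrated as a linear mapping fitted to paired, co-located observations from the two networks. The data and the linear form are assumptions for the sketch, not the method actually used for USCRN/COOP:

```python
# Hypothetical sketch: derive a linear transfer function from paired,
# co-located observations of two networks, then apply it to adjust the
# older network's readings. Data are synthetic.

def fit_transfer(x_net, y_net):
    """Least-squares y = a + b*x mapping network X readings onto network Y."""
    n = len(x_net)
    mx = sum(x_net) / n
    my = sum(y_net) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(x_net, y_net))
         / sum((x - mx) ** 2 for x in x_net))
    a = my - b * mx
    return a, b

# Paired daily max temperatures (deg C); here the older network is
# assumed to read a constant 0.4 C warm, purely for illustration
coop = [12.4, 18.9, 25.3, 30.1, 15.7]
crn  = [12.0, 18.5, 24.9, 29.7, 15.3]

a, b = fit_transfer(coop, crn)
adjusted = [a + b * t for t in coop]
print(f"offset={a:+.2f}, gain={b:.3f}")
```

With ~10,000 Cooperative Observer stations against ~100 reference stations, such transfer functions are what let the sparse benchmark network anchor the dense legacy network, as the "Anchor points" bullet later in the deck describes.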

  25. Planned USCRN Stations at end of 2008 (114* stations) Installed Paired Locations Installed Single Locations * Does not Include Alaska, Canada, Hawaii, & GCOS stations As of April 26, 2005

  26. Experimental Product

  27. Siting Standards Documents Representativeness • Network Plan • Site Acquisition Plan • Site Information Handbook • Site Survey Plan • Site Survey Handbook • Site Survey Checklist • Site Acquisition Checklist

  28. Major Principles of Station Siting • Site is representative of climate of region. • Minimal microclimatic influences. • Long-term (50-100 year) land tenure • Minimal prospects for human development • Avoids agriculture, major water bodies, major forested areas, basin terrain. • Accessible for calibration & maintenance. • Stable Host Agency or Organization. • Follows WMO Climate Station Siting Guidelines

  29. Objective Site Scoring • An objective scoring sheet was developed based on the Leroy method; the score for a station becomes part of that station's metadata • Stations are re-scored as part of the annual maintenance visit, allowing changes in the representativeness of the station's siting to be tracked over time
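A Leroy-style objective score of the kind described above might look like the following sketch. The criteria, distance thresholds, and class boundaries are placeholders chosen for illustration, not Leroy's published classification or the USCRN scoring sheet:

```python
# Hedged illustration of a Leroy-style site score: the class worsens as
# artificial heat sources or vegetation get closer to the sensor. The
# thresholds below are placeholders, not Leroy's published values.

def site_class(dist_to_heat_source_m, dist_to_vegetation_m):
    """Return a class 1 (best) .. 5 (worst); the overall class is the
    worse of the two criteria, mirroring the take-the-worst rule such
    classification schemes typically use."""
    def classify(d, thresholds):
        # thresholds listed best-to-worst; the first one satisfied wins
        for cls, min_d in enumerate(thresholds, start=1):
            if d >= min_d:
                return cls
        return 5
    heat = classify(dist_to_heat_source_m, [100, 30, 10, 5])
    veg = classify(dist_to_vegetation_m, [100, 30, 10, 5])
    return max(heat, veg)

print(site_class(120, 120))  # 1: both criteria far away
print(site_class(25, 120))   # 3: heat source 25 m away dominates
```

Because the score is stored as metadata and recomputed at each annual visit, a creeping change (e.g. a tree growing toward the sensor) shows up as a class change in the station's history rather than as an unexplained shift in the data.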

  30. International Cooperation, Collaboration and Partnerships • U.S. Representative on the Canadian National Monitoring Change Management Board • Canadian Reference Climate Network program participates on the USCRN Science Review Panel • USCRN hardware architecture incorporated into the Canadian Climate Monitoring Network • The two nations will exchange and co-locate reference climate stations in FY04, a first step in international cooperation toward establishing commonality among surface observing systems that monitor climate change

  31. QUESTIONS • How do we continue to expand international and national partnerships? • What is the best way to exchange information? • How do we glue the system of systems together? E-Mail: Bruce.Baker@noaa.gov URL: http://www.ncdc.noaa.gov/oa/climate/uscrn/index.html

  32. Network Characteristics • Benchmark Network for temperature and precipitation • Anchor points for USHCN and full COOP network • Long-Term Stability of Observing Site (50+ years) likely to be free from human encroachment • Sensors Calibrated to Traceable Standards • Planned redundancy of sensors and selected stations • Network Performance Monitoring - Hourly and Daily • Strong Science & Research Component
