
Quality Assurance Procedures for CORIE Data


Presentation Transcript


  1. Quality Assurance Procedures for CORIE Data
  Archival quality assessment: Instrument Network → Database → Web Visualization; Field Notes; ASCII Public Data Archive; Quality Flags
  Archival QA variables: Time, Pressure, Temperature, Salinity, Velocity, Meteorology
  Near real-time quality assessment: Field Staff; Realtime QA via Timeseries, Diagram of Slopes, Sequential Likelihood Ratio

  2. Real-time Quality Assurance
  • Visual evaluation of data quality 4 times a week
  • Automated testing for biofouling, checked by operator
  • Real-time checks result in notification of field staff
  • No database storage of notification
  • No incorporation of assessment into database quality flags
  • Non-automated use of assessment to deactivate web display of real-time data

  3. Archival quality assurance
  [Timeline: Feb–Apr schedule of CTD QA and ADP QA runs]
  • CTD and ADP data quality assessment on a monthly basis
  • 1 month lag in assessment (January data tested at the end of February)
  • CTD QA dependent on subsequent data
  • ADP QA not dependent on subsequent data
  • Data which passes QA is stored in ASCII public archive
  • QA not used to generate quality flags in database

  4. Timeseries
  • Timeseries of depth, salinity, temperature displayed on website
  • Inspected for instrument failure or biofouling

  5. Cross-estuary slope diagram
  • S-T plot of all stations
  • Almost all stations should produce the same S-T line
  • Chnke, ogi01, and ogi02 are exceptions
  [Figure callouts: mottb possibly biofouled; extensive biofouling]

  6. Sequential Likelihood Ratio
  • Based on linear S-T relationship across estuary
  • Accounts for local variation from linear relationship
  • Depends on S and T at daily maximum S at each station, river T and ocean T
  • Modeled S_clean and S_biofouled, based on T, T_R, and T_O, compared to measured S
  • Station-specific ratio cutoff, trained on known biofouled data
  • Used to generate a visual display
  • Currently trained for lower-estuary stations
  • Extension of method to lateral bays under development
  • Could be used for archival QA
  [Figure: modeled S_cl (clean) and S_bf (biofouled) vs. temperature, with T_R, T_M, T_O on the T axis and measured S_M marked; salinity axis 0–34]

  7. CTD: time
  • Radio network can produce data with bunched time values
  • Expected timestep between data points is determined from the data (median timestep)
  • If timesteps are shorter than the median timestep, with a preceding gap of the correct length, then the data are reassigned times evenly spaced over the gap
  • If the gap is longer than the data clump, the data clump is discarded
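The timestamp repair on this slide can be sketched in Python. This is a minimal reconstruction: the function name, the 0.5×/1.5× step thresholds for detecting gaps and bunches, and the exact respread convention are our assumptions, not CORIE's code.

```python
from statistics import median

def respace_bunched(times):
    """Sketch of the bunched-timestamp repair described above.

    A run of points spaced much closer than the median timestep,
    arriving right after a data gap, is respread evenly over the gap
    when the gap is just long enough to hold the run at the nominal
    rate; if the gap is longer than the run accounts for, the run is
    discarded.  Returns a new, repaired list of sample times (seconds).
    """
    if len(times) < 3:
        return list(times)
    dt = median(b - a for a, b in zip(times, times[1:]))
    out = [times[0]]
    i = 1
    while i < len(times):
        if times[i] - times[i - 1] > 1.5 * dt:       # a gap precedes point i
            j = i
            while j + 1 < len(times) and times[j + 1] - times[j] < 0.5 * dt:
                j += 1                               # extend over the bunch
            n = j - i + 1
            if n > 1:
                next_good = times[j + 1] if j + 1 < len(times) else times[j] + dt
                slots = round((next_good - out[-1]) / dt) - 1
                if slots == n:
                    # gap has the correct length: respread evenly over it
                    base = out[-1]
                    out.extend(base + k * dt for k in range(1, n + 1))
                # else: gap longer than the bunch accounts for -> discard
                i = j + 1
                continue
        out.append(times[i])
        i += 1
    return out
```

With a nominal 60 s timestep, a bunch of four samples arriving at the end of a four-slot gap is respread over the gap, while a bunch that cannot account for its gap is dropped.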

  8. CTD: pressure data
  • Pressure data is corrected for atmospheric pressure using the atmospheric pressure record from marsh or tansy
  • Tested for spikes using a high-pass filter (4th-order non-causal Butterworth filter with a cutoff period of 1 hour, implemented using the matlab function idfilt)
  • Spikes > 0.22 m removed
  • Period (30 minutes) around spike is removed
  • If a tidal period has extensive smaller spikes (mean noise > 0.01 m), the entire period is removed
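A simplified despiking pass, for illustration only: the CORIE procedure uses the Butterworth high-pass described above, while this sketch substitutes a running-median high-pass and omits the per-tidal-period mean-noise test. The function name and parameter defaults other than the 0.22 m and 30 min figures are our assumptions.

```python
from statistics import median

def despike_pressure(p, dt_s=60, spike_m=0.22, window_min=30, med_half=15):
    """Illustrative despiking pass (pure Python stand-in for the
    4th-order non-causal Butterworth high-pass in MATLAB idfilt).

    p: pressure series in metres, sampled every dt_s seconds.  Returns
    a copy with None where data was removed: every sample whose
    high-passed residual exceeds spike_m, plus window_min minutes of
    data centred on it.
    """
    n = len(p)
    resid = []
    for i in range(n):
        # residual of each point against a local running median
        lo, hi = max(0, i - med_half), min(n, i + med_half + 1)
        resid.append(p[i] - median(p[lo:hi]))
    out = list(p)
    half = int(round(window_min * 60 / dt_s / 2))   # half-window in samples
    for i, r in enumerate(resid):
        if abs(r) > spike_m:
            # blank the window around the spike
            for j in range(max(0, i - half), min(n, i + half + 1)):
                out[j] = None
    return out
```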

  9. CTD: temperature
  • Subject to range limits (T < 0 or T > 30 °C)
  • Subject to visual inspection
  • Instrument failure has generally produced extensive invalid values

  10. CTD: salinity
  • Main concern is biofouling, but conductivity sensors can also fail
  • Sensor failure is detected by range check (S < 0 or S > 35) and by visual inspection
  • Biofouling is tested using the cross-estuary S-T relationship
  • Determine the median S-T slope for each tidal period

  11. CTD: salinity
  • Compare each instrument's S-T slope for that tidal period to the median
  • Fouled cutoff: abs(local slope) – abs(median slope) > 0.2 ⇒ biofouled
  • When an instrument is considered biofouled, preceding data is considered biofouled until a clean cutoff is exceeded
  • Clean cutoff: abs(local slope) – abs(median slope) < 0
  • When the median slope approaches 0, the method fails
  • If an instrument is biofouled after a period of near-zero slope, the entire period of near-zero slope is considered biofouled
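The slope-comparison flagging in these two slides can be sketched as follows (a minimal sketch; the function name and the backward-propagation details are our reading of "preceding data is considered biofouled until a clean cutoff is exceeded"):

```python
def flag_biofouling(local_slopes, median_slopes, foul_cut=0.2, clean_cut=0.0):
    """Per-tidal-period biofouling flags for one station.

    local_slopes / median_slopes: one S-T slope per tidal period for the
    station and for the cross-estuary median.  A period is flagged when
    |local| - |median| > foul_cut; the flag is then extended backward in
    time until a period with |local| - |median| < clean_cut is reached.
    """
    diffs = [abs(l) - abs(m) for l, m in zip(local_slopes, median_slopes)]
    flags = [d > foul_cut for d in diffs]
    # back-propagate: data preceding a fouled period stays suspect
    # until a clearly clean period appears (note: near-zero diffs are
    # NOT clean, which is why the method fails when the median slope
    # approaches zero)
    for i in range(len(diffs) - 1, -1, -1):
        if flags[i]:
            j = i - 1
            while j >= 0 and not flags[j] and diffs[j] >= clean_cut:
                flags[j] = True
                j -= 1
    return flags
```

For example, a slope excursion of 0.25 above the median flags that period, and the preceding zero-diff period with it, but a clearly clean period (negative diff) stops the backward spread.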

  12. CTD: salinity
  • Automated assessment produces both false positives and false negatives
  • Results are manually checked
  [Figure callouts: transient biofouling; false positive; false negative]

  13. ADP: velocity
  Two major sources of bad velocity data:
  • Surface reflection
  • Low signal strength
    • Signal strength decreases with distance from the instrument
    • Signal strength decreases with biofouling of the instrument over time
  One minor source: instrument roll-over
  Unlike conductivity sensors, biofouling is easily detected because signal strength is a measured variable

  14. ADP: velocity
  • Signal strength < 20 dB: data flagged bad
  • Documentation recommends a cutoff of 9 dB, but testing indicates that a 9 dB cutoff admits some questionable data
  • Surface reflection detected by increasing signal strength
  • Signal strength increase can also be caused by variation in reflective material in the water
  • Determine approximate surface from pressure record; check for signal strength increase within 3 m of approximate surface
  [Figure: signal strength profile showing surface reflection]
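The two ADP screens above can be sketched for a single profile (a sketch under our assumptions: the function name, the string flag values, and the increasing-height bin layout are ours; the 20 dB and 3 m cutoffs are from the slides):

```python
def screen_profile(bin_heights, strengths, surface):
    """Screen one ADP velocity profile.

    bin_heights: height of each bin above the instrument (m), increasing.
    strengths:   backscatter signal strength per bin (dB).
    surface:     approximate surface height from the pressure record (m).
    Returns one flag per bin: 'ok', 'low_signal', or 'surface'.
    """
    flags = []
    for i, (h, s) in enumerate(zip(bin_heights, strengths)):
        if s < 20.0:
            # documentation suggests 9 dB, but testing showed 20 dB is safer
            flags.append('low_signal')
        elif surface - h <= 3.0 and i > 0 and s > strengths[i - 1]:
            # strength should decay with range; an increase near the
            # surface indicates surface reflection
            flags.append('surface')
        else:
            flags.append('ok')
    return flags
```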

  15. Meteorological Data
  • Subject to physical range tests and visual inspection
  • Further methods under development

  16. Storage of Quality Assessment
  • Data records which do not meet minimal quality standards are stored in the raw data files, but do not enter the database
  • Notices of observer suspicion of data quality are not currently stored in a formal manner, and are not entered into the database
  • Archival quality assurance procedures currently generate public archive files which contain only data which has passed the quality assurance tests
  • The quality assurance flagging is not currently stored in the database

  17. End

  18. Models
  • A model of the clean signal
    • Temperature and salinity variation are correlated: modeled daily maximum salinity and the corresponding temperature are jointly Gaussian
    • The probability density for observing the sequence of salinity measurements {s_n}, given the sequence of recorded mixing coefficients {T_n} and a clean sensor: p({s_n} | {T_n}, clean)
  • A model of the biofouled signal
    • Allows for a different degradation rate m for each biofouling episode, and an arbitrary onset time t, with these parameters fit to incoming data: p({s_n} | {T_n}, m, t, biofouled)
    • m and t are unknown; they are fit to the data sequence by maximum likelihood, giving p({s_n} | {T_n}, m̂, t̂, biofouled) = p({s_n} | {T_n}, biofouled)

  19. Regression Model: Mixture of Experts
  [Diagram: input vector T feeds expert networks 1…n (outputs m_1 … m_n) and a gating network (weights g_1 … g_n), combined into output m]
  • The correlation between salinity and temperature is not stationary
  • The detector system needs to switch between seasons
  • A mixture of local models can cover different behaviors
  • Both the experts and the gating network receive the same input vector
  • Each expert network tackles one of the different seasons
  • The gating network decides which of the experts should be used
  • Regression output: m = Σᵢ gᵢ mᵢ
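A toy version of this mixture-of-experts regression, to make the gating idea concrete. All parameter values are hypothetical, and the softmax gating is a standard choice we assume here; the slide only says the gating network weights the experts.

```python
import math

def mixture_of_experts(T, experts, gates):
    """Combine linear experts m_i(T) = a_i*T + b_i with a softmax
    gating network over scores w_i*T + c_i:  m = sum_i g_i(T) * m_i(T).

    experts: list of (a_i, b_i) expert parameters.
    gates:   list of (w_i, c_i) gating parameters.
    """
    scores = [w * T + c for (w, c) in gates]
    mx = max(scores)
    exps = [math.exp(s - mx) for s in scores]   # numerically stable softmax
    z = sum(exps)
    g = [e / z for e in exps]
    m = [a * T + b for (a, b) in experts]
    return sum(gi * mi for gi, mi in zip(g, m))
```

With gating parameters that strongly prefer one expert for high T and the other for low T, the output switches between the two local models, which is the seasonal switching the slide describes.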

  20. Approach and Results
  • Parameterized novelty detectors embedded in a sequential likelihood ratio test
  • The SLR at current time N is compared to a threshold to identify biofouling events
  • Results: automated biofouling detectors deployed throughout the estuary; monitored by an observer and used to send out notices of biofouling events, but not incorporated directly into data flagging
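The sequential likelihood ratio test can be sketched with the two Gaussian models from the Models slide. This is a simplified stand-in: the degradation factor, noise level, and threshold values are illustrative, and the real system fits the degradation rate and onset time by maximum likelihood rather than fixing them as done here.

```python
import math

def log_gauss(x, mu, sigma):
    """Log density of a Gaussian N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def slr_biofouling(s_meas, s_pred, m=0.9, sigma=0.5, threshold=5.0):
    """Running log-likelihood ratio of biofouled vs. clean models.

    s_pred: clean-model prediction of daily-maximum salinity from the
    S-T regression.  The biofouled model degrades it by a fixed factor
    m per step (a stand-in for the fitted degradation rate).  Returns
    (llr, alarm): the accumulated ratio at the current time and whether
    it crossed the alarm threshold.
    """
    llr = 0.0
    for n, (s, sc) in enumerate(zip(s_meas, s_pred), start=1):
        sb = sc * m ** n        # biofouled model: progressive freshening
        llr += log_gauss(s, sb, sigma) - log_gauss(s, sc, sigma)
    return llr, llr > threshold
```

Measurements tracking the clean prediction drive the ratio down (no alarm); measurements decaying like the biofouled model drive it up past the threshold.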

  21. Criteria for rejecting data before it enters the database
  • rserial2db rejects data lines based on failed checksum or garbled line; example log messages:
      Short input line: [RE^M], skipping.
      Skipping unknown data line: [abedCT 0000 00 00 00 00 00 1516D +20.856, +07.947, +19.0889*6F]
      Checksum failed for data line: W,üR'¢í?»TW%X¯»U»PT$CRdsdmaRV0CTDd00730R seabedCT 0000 00 00 00 00 00 1516D +09.502, +08.366, +08.0447*60
      Short input line: [], skipping.
      Skipping unknown data line: [W,ýS'¢è¾?»T W%Y­»S»UT10394A141322 1316:0 746:1 :2 :3 :4 532:5 -1806:6 :7]
      Line length = 162, must be 81 to 83 chars long, skipping data line: 10395A138173 1193:0 770:1 :2 :3 :4 282:5 :6 10395A138177 1192:0 770:1 :2 :3 :4 278:5 :6 :7
  • Most data is not subjected to a sanity check (e.g. salinity < 0 or > 35)
  • Certain stations are handled as special cases and are subject to sanity checks (ogi02 is checked for negative salinity, temperature, and conductivity)
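The length and checksum screening could look roughly like this. This is our reconstruction, not rserial2db's code: we assume an NMEA-style XOR checksum over the line body, which the trailing `*6F` / `*60` in the log excerpts suggests, but the real wire format may differ.

```python
def accept_line(line, min_len=81, max_len=83):
    """Accept or reject one raw data line (reconstruction, see above).

    Rejects lines outside the 81-83 character length window and lines
    whose trailing *XX hex checksum does not match an XOR over the body.
    """
    if not (min_len <= len(line) <= max_len):
        return False                      # "Line length = ..., skipping"
    body, sep, cksum = line.rpartition('*')
    if sep != '*' or len(cksum) != 2:
        return False                      # no recognizable checksum field
    x = 0
    for ch in body:
        x ^= ord(ch)                      # XOR all body characters
    try:
        return x == int(cksum, 16)        # "Checksum failed" otherwise
    except ValueError:
        return False
```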

  22. Depth spikes removed

  23. Slope Comparison

  24. Salinity Flagging
