
SeaDataNet Technical Task Group meeting




1. SeaDataNet Technical Task Group meeting
JRA1 Standards Development Task 1.2: Common Data Management Protocol (for dissemination to all NODCs and JRA3)
• Data quality checking methodology
• Quality flag scale protocol
• Definition of a common tagging (identification) system
Only the first two are dealt with here.

2. SeaDataNet Technical Task Group meeting
Data quality checking methodology: work plan
• Review current schemes in place at NODCs
• Review other known schemes (e.g. WG MDM guidelines, World Ocean Database, GTSPP, Argo, WOCE)
• Data types: profile and time series data
• Present first version to the kick-off meeting in June
• Revise on the basis of comments/feedback
• Finalise draft document by end of July (β version)
• Distribute to partners for further feedback
• Finalise document by end of November 2006

3. SeaDataNet Technical Task Group meeting
Data quality checking methodology: progress to date
The following QC procedures/documents have been examined:
• WG MDM Guidelines (covering 12 data types)
• BODC
• SISMER (and MEDATLAS)
• IOC/IODE programmes (GTSPP and GOSUD)
• World Ocean Database and GODAR
• Argo
To be taken into consideration:
• For metocean data (e.g. currents, waves, met. buoys), use of the SIMORC Quality Control document is suggested
• For sea level data, use of the ESEAS and GLOSS documentation is suggested
• Information from QARTOD (Quality Assurance of Real Time Oceanographic Data, at qartod.org) should be fed in

4. SeaDataNet Technical Task Group meeting
Summary list of information required to accompany data (a sketch of such a record follows the list):
• Where the data were collected: location (preferably as latitude and longitude) and depth/height
• When the data were collected (date and time in UTC or clearly specified local time zone)
• How the data were collected (e.g. sampling methods, instrument types, analytical techniques)
• How you refer to the data (e.g. station numbers, cast numbers)
• Who collected the data, including the name and institution of the data originator(s) and the principal investigator
• What has been done to the data (e.g. details of processing and calibrations applied, algorithms used to compute derived parameters)
• Watch points for other users of the data (e.g. problems encountered and comments on data quality)
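To make the list concrete, here is a minimal Python sketch of a record carrying these fields. The class and field names are hypothetical illustrations, not a SeaDataNet schema.

from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record structure; class and field names are illustrative only.
@dataclass
class StationMetadata:
    latitude: float                 # where: decimal degrees, positive north
    longitude: float                # where: decimal degrees, positive east
    depth_m: float                  # where: depth/height of the measurement
    observed_at: datetime           # when: preferably UTC
    instrument: str                 # how collected: e.g. instrument type
    station_id: str                 # how referred to: station/cast number
    originator: str                 # who: originator, institution, PI
    processing_notes: list[str] = field(default_factory=list)   # what was done
    quality_comments: list[str] = field(default_factory=list)   # watch points

record = StationMetadata(
    latitude=57.3, longitude=-10.1, depth_m=250.0,
    observed_at=datetime(2006, 6, 15, 12, 0, tzinfo=timezone.utc),
    instrument="CTD", station_id="CAST-042",
    originator="J. Smith, Example Institute",
)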

5. SeaDataNet Technical Task Group meeting
Examples of automatic QC tests (based on Argo, derived from GTSPP, IOC Manuals and Guides No. 22); a code sketch of three of these tests follows the list:
• Impossible date: tests for sensible observation date and time values
• Impossible location: tests for sensible observation latitude and longitude values
• Position on land: tests whether the observation position is on land
• Impossible speed: tests for a sensible distance travelled since the previous profile
• Global range: tests that the observed temperature and salinity values are within the expected extremes encountered in the oceans
• Regional range: tests that the observed temperature and salinity values are within the expected extremes encountered in particular regions of the oceans
• Deepest pressure: tests that the profile does not contain pressures higher than the highest value expected for a float
• Pressure increasing: tests that pressures from the profile are monotonically increasing
• Spike: tests salinity and temperature data for large differences between adjacent values
• Gradient: tests whether the gradient between vertically adjacent salinity and temperature measurements is too steep
• Digit rollover: tests whether the temperature and salinity values exceed a float's storage capacity
• Stuck value: tests for all salinity or all temperature values in a profile being the same
• Density inversion: tests for the case where the calculated density at a higher pressure in a profile is less than the calculated density at an adjacent lower pressure
• Sensor drift: tests temperature and salinity profile values for a sudden and important sensor drift
• Frozen profile: tests for the case where a float repeatedly produces the same temperature or salinity profile (with very small deviations)
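As an illustration, a minimal Python sketch of three of these tests (global range, pressure increasing, spike). The spike formula is the standard Argo/GTSPP interior-point test; the thresholds in the usage example are illustrative assumptions, not the official values.

def global_range(values, lo, hi):
    # Global range: flag values outside expected ocean-wide extremes.
    return [lo <= v <= hi for v in values]

def pressure_increasing(pressures):
    # Pressure increasing: profile pressures must be monotonically increasing.
    return all(p2 > p1 for p1, p2 in zip(pressures, pressures[1:]))

def spike(values, threshold):
    # Spike: test value = |v2 - (v3 + v1)/2| - |v3 - v1|/2 at each interior point.
    flags = [True] * len(values)
    for i in range(1, len(values) - 1):
        test = (abs(values[i] - (values[i + 1] + values[i - 1]) / 2)
                - abs(values[i + 1] - values[i - 1]) / 2)
        if test > threshold:
            flags[i] = False  # mark the interior point as a spike
    return flags

# Illustrative thresholds only (assumptions, not the official Argo limits):
temps = [12.1, 12.0, 18.5, 11.8]
print(global_range(temps, -2.5, 40.0))   # [True, True, True, True]
print(spike(temps, threshold=6.0))       # [True, True, False, True]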

6. SeaDataNet Technical Task Group meeting
Quality flag scale protocol: work plan
• Review current schemes in place at NODCs
• Review other known schemes (e.g. Argo, GTSPP, WOCE, MEDATLAS)
• Data types: profile and time series data
• Present to the kick-off meeting in June
• Revise on the basis of comments/feedback
• Finalise draft document by end of July (β version)
• Distribute to partners for further feedback
• Finalise document by end of November 2006

7. SeaDataNet Technical Task Group meeting
Quality flag scale protocol: progress to date
Preliminary examination of the following schemes:
• BODC
• SISMER and MEDATLAS
• GTSPP
• WOCE (CTD and water bottle)
• WOCE (surface meteorology)
• WOCE (floats)
• Argo
• ESEAS
The next slide shows examples of flagging schemes.

8. SeaDataNet Technical Task Group meeting
[Slide shows two example flag tables: MEDATLAS (above) and WOCE CTD (below).]
Flags used to describe data (SISMER); the same scale is written out as code after this list:
• 0: No quality control
• 1: Good
• 2: Probably good
• 3: Probably bad
• 4: Bad
• 5: Changed
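For reference, the SISMER scale quoted above expressed as a Python enumeration; the class and member names are my own labels, not SISMER identifiers.

from enum import IntEnum

# SISMER flag values as quoted on the slide; member names are illustrative.
class SismerFlag(IntEnum):
    NO_QUALITY_CONTROL = 0
    GOOD = 1
    PROBABLY_GOOD = 2
    PROBABLY_BAD = 3
    BAD = 4
    CHANGED = 5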

9. SeaDataNet Technical Task Group meeting
Quality flag scale protocol: conclusions
• A preliminary review of data quality flagging schemes shows small variations on a theme for oceanographic data, although more complicated and detailed schemes also exist.
• The most straightforward solution would be a simple scheme, perhaps comprising the following quality flags:
  – Not checked
  – Good/correct value
  – Doubtful/suspect value
  – Bad value
• For some organisations this will involve mapping their schemes to this simple scheme (a sketch of one such mapping follows).
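As an illustration of the mapping step, a minimal Python sketch translating the SISMER codes from slide 8 into the proposed four-flag scheme. The numeric codes for the simple scheme and the specific pairings marked below are assumptions for illustration, not an agreed mapping.

# Proposed simple scheme (codes 0-3 are my own numbering, not from the slides).
SIMPLE = {"not_checked": 0, "good": 1, "doubtful": 2, "bad": 3}

# SISMER code -> simple code; pairings marked "assumption" are illustrative.
SISMER_TO_SIMPLE = {
    0: SIMPLE["not_checked"],  # No quality control
    1: SIMPLE["good"],         # Good
    2: SIMPLE["good"],         # Probably good (assumption: fold into good)
    3: SIMPLE["doubtful"],     # Probably bad (assumption: map to doubtful)
    4: SIMPLE["bad"],          # Bad
    5: SIMPLE["doubtful"],     # Changed (assumption: treat as suspect)
}

def to_simple(sismer_flag: int) -> int:
    # Translate one SISMER flag value into the simple scheme.
    return SISMER_TO_SIMPLE[sismer_flag]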
