
Measuring and Communicating Data Quality
UNECE Training Workshop on Dissemination of MDG Indicators and Statistical Information
Astana, Kazakhstan, 23 – 25 November 2009
Steven Vale, UNECE


Presentation Transcript


  1. Measuring and Communicating Data Quality UNECE Training Workshop on Dissemination of MDG Indicators and Statistical Information Astana, Kazakhstan 23 – 25 November 2009 Steven Vale, UNECE

  2. Contents • What is quality? • How can we measure quality? • How should we report and communicate quality?

  3. Which is the Best Quality?

  4. Definition of Quality • The international standard ISO 9000:2005 defines quality as: ‘The degree to which a set of inherent characteristics fulfils requirements.’

  5. What Does This Mean? • Whose requirements? • The user of the goods or services • A set of inherent characteristics? • Users judge quality against a set of criteria reflecting the different characteristics of the goods or services • So quality is all about providing goods and services that meet the needs of users (customers)

  6. Quality Criteria

  7. Quality Criteria for Statistics • Different statistical organisations use different criteria, but the lists of criteria are quite similar • UNECE list: Relevance, Accuracy, Timeliness, Punctuality, Comparability, Accessibility, Clarity

  8. Relevance • Are the statistics that are produced needed? • Are the statistics that are needed produced? • Do the concepts, definitions and classifications meet user needs?

  9. Accuracy • The closeness of statistical estimates to true values • In the past: Quality = Accuracy • Now accuracy is just one part of quality

  10. Timeliness • The length of time between data being made available and the event or phenomenon they describe Punctuality • The time lag between the actual delivery date and the promised delivery date
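Both measures reduce to simple date arithmetic. A minimal sketch, using hypothetical reference and release dates purely for illustration:

```python
from datetime import date

# Hypothetical dates, for illustration only
reference_period_end = date(2009, 6, 30)   # end of the period the data describe
planned_release = date(2009, 9, 30)        # delivery date announced in advance
actual_release = date(2009, 10, 15)        # date the data were actually published

# Timeliness: gap between the event described and the data becoming available
timeliness_days = (actual_release - reference_period_end).days

# Punctuality: gap between the promised and the actual delivery date
punctuality_days = (actual_release - planned_release).days

print(f"Timeliness: {timeliness_days} days after the reference period")
print(f"Punctuality: {punctuality_days} days late")   # negative would mean early
```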

  11. Comparability • The extent to which differences are real, or due to methodological or measurement differences • Comparability over time • Comparability through space (e.g. between countries / regions) • Comparability between statistical domains (sometimes referred to as coherence)

  12. Accessibility • The ways in which users can obtain or benefit from statistical services (pricing, format, location, language etc.) Clarity • The availability of additional material (e.g. metadata, charts etc.) to allow users to understand outputs better

  13. Importance of Accessibility • Not just about making data available on the Internet or in a book • Passive accessibility • Accessibility is about bringing data to users in an understandable way, opening a dialogue with those users, and ensuring that their information needs are met • Active accessibility

  14. Accessibility Should Include: • Communicating • Marketing • Interpreting • “Story-telling” • Informing • Educating

  15. Accessibility and Visualization • Good visualizations make data accessible to many more users • Bad visualizations are unhelpful / misleading • “Self-service” visualization needs to be simple, with guidance to help users get meaningful results • “Ready-made” visualizations can be more complex, tailored to specific data sets

  16. Accessibility and Visualization • Is it more cost-effective to: • develop “ready-made” graphics, or • offer users more “self-service” functionality? • Many users don’t have the time or knowledge to produce good visualizations • Advanced users have access to their own visualization and analysis tools

  17. Importance of Clarity • Clarity is all about explaining data • Do current explanatory notes help? • Often written by specialists for specialists • Full of jargon • Too long • Too boring! • Simplified, plain-text versions needed

  18. Other Considerations • Cost / efficiency • Integrity / trust • Reputation of the organization • Professionalism • Adherence to international standards (e.g. UN Fundamental Principles of Official Statistics)

  19. Quality is not just about outputs • To have good outputs we need good inputs and processes, so we need to think about the quality of these as well • Input → Process → Output

  20. Quality of Inputs • Timeliness • Completeness – are there any missing units or variables? • Comparability with other sources • Quality check survey? • Knowledge of the source is vital!
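A minimal sketch of a completeness check along these lines, assuming a hypothetical register of expected units and a small set of received records (all names and values are illustrative only):

```python
# Hypothetical expected register and received records
expected_units = {"unit_001", "unit_002", "unit_003", "unit_004"}
required_variables = {"turnover", "employees"}

received = [
    {"id": "unit_001", "turnover": 120.0, "employees": 15},
    {"id": "unit_002", "turnover": None, "employees": 8},   # missing variable value
    {"id": "unit_004", "turnover": 95.5, "employees": 12},
]

# Missing units: expected in the register but not present in the input
missing_units = expected_units - {rec["id"] for rec in received}

# Missing variables: received units with empty values for required variables
missing_values = {
    rec["id"]: [var for var in required_variables if rec.get(var) is None]
    for rec in received
}
missing_values = {uid: gaps for uid, gaps in missing_values.items() if gaps}

print("Missing units:", sorted(missing_units))           # e.g. ['unit_003']
print("Units with missing variables:", missing_values)   # e.g. {'unit_002': ['turnover']}
```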

  21. Quality of Processing • Quality of matching / linking • Outlier detection and treatment • Quality of data editing • Quality of imputation • Keep raw data / metadata to refer back to if necessary
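A minimal sketch of this kind of processing step, using a simple robust outlier rule and median imputation as stand-ins for whatever methods an office actually applies (values are illustrative only):

```python
import statistics

raw = [12.0, 14.5, 13.2, None, 250.0, 13.8, 12.9]   # hypothetical raw input, kept unchanged
observed = [x for x in raw if x is not None]

# Outlier detection with a simple robust rule: flag values whose distance from
# the median exceeds 3 times the median absolute deviation (MAD)
med = statistics.median(observed)
mad = statistics.median(abs(x - med) for x in observed)
outliers = [x for x in observed if abs(x - med) > 3 * mad]

# Imputation: fill missing values with the median of the observed values
edited = [med if x is None else x for x in raw]

print("Flagged outliers:", outliers)   # [250.0]
print("Edited series:", edited)
# 'raw' is left untouched so edits and imputations can be traced back later
```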

  22. Quality of Outputs • Are the users satisfied? • Are the outputs comparable with data from other sources? • What is the impact on time series? • Are the outputs cost-effective? • Quality reports to measure and communicate differences?

  23. Measuring Quality • Quantitative methods • E.g. confidence intervals • User surveys • Self evaluation • Benchmarking

  24. Quantitative Measures • Example: a bar chart in which the tops of the bars show the estimated values and the red lines show the confidence intervals around them
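A minimal sketch of the calculation behind such a chart: an approximate 95% confidence interval for an estimated mean, using purely illustrative numbers:

```python
import math
import statistics

# Hypothetical sample of observations, for illustration only
sample = [102.0, 98.5, 110.2, 95.4, 104.8, 99.1, 101.3, 97.6]

estimate = statistics.mean(sample)
std_error = statistics.stdev(sample) / math.sqrt(len(sample))

# Approximate 95% confidence interval (normal approximation, z = 1.96)
lower = estimate - 1.96 * std_error
upper = estimate + 1.96 * std_error

print(f"Estimate: {estimate:.1f}")
print(f"95% confidence interval: ({lower:.1f}, {upper:.1f})")
```

The width of the interval is the quantitative quality measure: the narrower it is, the more accurate the estimate.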

  25. UNECE Database User Survey • Launched each autumn on the database web site • 10 questions • 150 responses (target: 100)

  26. Exercise • Design a user survey with up to 10 questions for users of your web site • 20 minutes

  27. UNECE User Survey Questions 1. Type of user 2. Frequency of use 3. Location (country) 4. Type of data 5. Database relevance 6. Timeliness

  28. Continued... 7. Clarity (metadata) 8. Overall data quality 9. User interface 10. Other comments and questions

  29. Results: Type of user

  30. Results: Frequency of use

  31. Results: Location

  32. Results: Data quality

  33. Results: User interface

  34. Improving Our Services • Better timeliness of data • New “Country Overview” data cube to give quick access to key indicators • More content in Russian • Improved user interface • More and better metadata • Statistical literacy

  35. Self-evaluation • Relatively quick and cheap • Is it sufficiently objective? • Needs a standard framework to ensure comparability of quality assessments • Eurostat DESAP checklist: http://epp.eurostat.ec.europa.eu/portal/page/portal/quality/documents/desap%20G0-LEG-20031010-EN.pdf

  36. Benchmarking • Comparing data values or data production processes between two sources • Differences can be studied to try to find ways to improve quality
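A minimal sketch of benchmarking data values between two sources, assuming hypothetical indicator values and an arbitrary 5% threshold for flagging differences worth investigating:

```python
# Hypothetical indicator values from two sources, for illustration only
source_a = {"poverty_rate": 12.4, "literacy_rate": 97.1, "child_mortality": 21.0}
source_b = {"poverty_rate": 13.9, "literacy_rate": 97.0, "child_mortality": 24.5}

THRESHOLD = 0.05   # flag relative differences above 5%

# Compare indicators available in both sources
for indicator in sorted(source_a.keys() & source_b.keys()):
    a, b = source_a[indicator], source_b[indicator]
    rel_diff = abs(a - b) / a
    flag = "INVESTIGATE" if rel_diff > THRESHOLD else "ok"
    print(f"{indicator}: {a} vs {b} ({rel_diff:.1%}) {flag}")
```

The flagged differences are the starting point for the study of methodological or measurement causes mentioned above.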

  37. Benchmarking Between Countries • Fairly cheap and easy way to get ideas on how to improve statistical processes • Mutual benefit – “win-win” • Helps to improve international cooperation • May lead to joint development projects

  38. Communicating Quality • Quality Reports • Summary – “traffic light” indicator • Red – serious quality issues, read the quality report before using • Orange – caution, do not use for important decisions without reading the quality report • Green – good quality • Intermediate – short quality report (1,000 words maximum) • Detailed – full quality report
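A minimal sketch of how such a summary indicator could be derived, assuming a hypothetical 0–100 quality score rather than any official scoring rule:

```python
def traffic_light(score: float) -> str:
    """Map a hypothetical 0-100 quality score to the traffic-light summary."""
    if score >= 80:
        return "GREEN - good quality"
    if score >= 50:
        return "ORANGE - caution, read the quality report before important decisions"
    return "RED - serious quality issues, read the quality report before using"

# Illustrative scores only
for score in (92, 63, 35):
    print(score, "->", traffic_light(score))
```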

  39. Detailed Quality Reports • Should cover all components of quality • Should be written for the user • Should be easily accessible • Should follow a standard template

  40. Exercise • What should be covered in a detailed quality report? • List the topics that should be included • 10 minutes

  41. ESQR Contents (1) • Introduction to the statistical process and its outputs • Relevance • Accuracy • Timeliness • Punctuality • Accessibility • Clarity

  42. ESQR Contents (2) • Comparability • Trade-offs between quality components • Assessment of User Needs and Perceptions • Performance, Cost and Respondent Burden • Confidentiality, Transparency and Security • Conclusion

  43. Summary • Quality is all about meeting user needs • There are many different aspects to quality, some of which may be in conflict • E.g. Timeliness versus Accuracy • There are various ways of measuring quality; user views are important • Quality should be communicated to users in a way they can understand

  44. Which is the Best Quality? It depends what the user needs!

  45. Questions?
