
Collecting High Quality Outcome Data, part 2



  1. Collecting High Quality Outcome Data, part 2

  2. Learning objectives • By the end of this module, learners will understand: • Steps to implement data collection, including developing a data collection schedule, training data collectors, and pilot testing instruments • Data quality as it relates to reliability, validity, and minimizing bias • Key concepts are illustrated using a variety of program measurement scenarios.

  3. Agenda • Implementing data collection • Developing a data collection schedule • Training data collectors • Pilot testing instruments • Ensuring data quality • Reliability • Validity • Minimizing bias • Summary of key points; additional resources

  4. How to use this course module • Go through the sections of the course module at your own pace. Use the “back” arrow to return to a section if you want to see it again. • The audio narration plays automatically. To turn it off, click the button that looks like a speaker. • There are several learning exercises within the course. Try them to check your understanding of the material. There is no grade associated with these exercises. • Practicum materials are provided separately for use by trainers and facilitators.

  5. Steps to implement data collection • After identifying a data source, method, and instrument: • Identify whom to work with to collect data • Set a schedule for collecting data • Train data collectors • Pilot test the data collection process • Make changes • Implement data collection FOR BEST RESULTS, make key decisions about how to implement data collection BEFORE program startup!

  6. Creating a data collection schedule • Identifies who will collect data, using which instrument, and when • Share it with the team to keep everyone informed • Include stakeholders in planning • Include dates for collecting, analyzing, and reporting data • Use any format you like (a machine-readable sketch follows below)
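
One way to keep the schedule shareable and sortable is a simple machine-readable table. Below is a minimal sketch in Python; the field names (collector, instrument, activity, due) and the tasks themselves are illustrative assumptions, not part of the module.

```python
from datetime import date

# A minimal data collection schedule: who collects data, with which
# instrument, and when. Field names here are illustrative, not prescribed.
schedule = [
    {"collector": "Site coordinator", "instrument": "Pre-survey",
     "activity": "collect", "due": date(2024, 9, 15)},
    {"collector": "Site coordinator", "instrument": "Post-survey",
     "activity": "collect", "due": date(2025, 5, 15)},
    {"collector": "Evaluator", "instrument": "Post-survey",
     "activity": "analyze", "due": date(2025, 6, 15)},
    {"collector": "Program director", "instrument": "Post-survey",
     "activity": "report", "due": date(2025, 7, 1)},
]

# Print the schedule in due-date order so the whole team sees
# upcoming tasks at a glance.
for task in sorted(schedule, key=lambda t: t["due"]):
    print(f'{task["due"]}  {task["activity"]:>8}  '
          f'{task["instrument"]:<12} {task["collector"]}')
```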

  7. Training data collectors • Determine best person(s) to collect data • Provide written instructions for collecting data • Explain importance and value of data for program • Walk data collectors through instrument • Practice or role play data collection • Review data collection schedule • Explain how to return completed instruments

  8. Pilot testing for feasibility and data quality • Try out instruments with a small group similar to program participants • Discuss the instrument with respondents • Analyze pilot test data to ensure the instrument yields the right information (a quick tally like the sketch below can help) • Revise the instrument and data collection process based on feedback • Implement data collection! Questions for debrief: • How long did it take to complete? • What did you think the questions were asking you about? • Were any questions unclear, confusing, or difficult to answer? • Were response options adequate? • Did the questions allow you to say everything you wanted to say?
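
If pilot responses are captured electronically, even a quick tally can surface problem items before full rollout. A minimal sketch with made-up pilot data; the 50% skip threshold is an arbitrary assumption:

```python
# Hypothetical pilot responses: completion time in minutes and answers,
# with None marking questions the respondent skipped.
pilot = [
    {"minutes": 12, "answers": {"q1": 4, "q2": None, "q3": 5}},
    {"minutes": 18, "answers": {"q1": 3, "q2": None, "q3": 4}},
    {"minutes": 10, "answers": {"q1": 5, "q2": 2,    "q3": None}},
]

avg_time = sum(r["minutes"] for r in pilot) / len(pilot)
print(f"Average completion time: {avg_time:.1f} minutes")

# Items skipped by many respondents are candidates for rewording.
for q in pilot[0]["answers"]:
    skipped = sum(1 for r in pilot if r["answers"][q] is None)
    if skipped / len(pilot) >= 0.5:
        print(f"{q}: skipped by {skipped}/{len(pilot)} respondents -- review wording")
```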

  9. Exercise #1 • Which statement is FALSE? • Data collectors should be trained only when instruments are complex. • Data collectors should receive written instructions on how to administer the instrument. • Data collectors should receive a copy of the data collection schedule. • Data collectors should be trained by walking them through the instrument. • None of the above

  10. Exercise #2 • The main purpose of pilot testing is to: • Determine how well an instrument will work in your context • Find out if respondents like your instrument • Ensure the confidentiality of sensitive data • Determine which of two competing instruments works best

  11. Ensuring data quality: Reliability, validity, bias • Reliability is the ability of a method or instrument to yield consistent results under the same conditions. • Validity is the ability of a method or instrument to measure accurately. • Bias involves systematic distortion of results due to over- or under-representation of particular groups, question wording that encourages or discourages particular responses, or poorly timed data collection.

  12. Reliability • Administer instruments the same way every time • Provide written instructions for respondents • Provide written instructions for data collectors • Train and monitor data collectors • Design instruments to improve reliability • Use clear, unambiguous language so each question's meaning is evident. • Use more than one question to measure an outcome (see the internal-consistency sketch below). • Use attractive, uncluttered layouts that are easy to follow.
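
The module doesn't prescribe a statistic, but a common way to check whether several questions measuring one outcome yield consistent results is Cronbach's alpha. A minimal sketch with invented ratings:

```python
from statistics import pvariance

# Each row is one respondent's answers to three questions that are all
# meant to measure the same outcome (e.g., attachment to school).
items = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 2],
    [4, 4, 4],
]

k = len(items[0])                                 # number of questions
item_vars = [pvariance(col) for col in zip(*items)]
total_var = pvariance([sum(row) for row in items])

# Cronbach's alpha: higher values (commonly >= 0.7) suggest the questions
# yield consistent results for the same respondent.
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```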

  13. Ensuring reliability • For Surveys • Avoid ambiguous wording that may lead respondents to interpret the same question differently. • For Interviews • Don’t paraphrase or change question wording. • Don’t give verbal or non-verbal cues that suggest preferred responses. • For Observation • Train and monitor observers to ensure consistency in how they interpret what they see.

  14. Validity • Validity is the ability of a method or instrument to measure accurately. • The instrument measures the same outcome identified in the theory of change (attitude, knowledge, behavior, condition) • The instrument measures the relevant dimensions of the outcome • Instrument results are corroborated by other evidence (one way to check this appears in the sketch below)
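
As a rough check that results are corroborated by other evidence, instrument scores can be correlated with an independent measure of the same outcome. A minimal sketch with hypothetical engagement scores and attendance rates (uses statistics.correlation, available in Python 3.10+):

```python
from statistics import correlation

# Hypothetical data: engagement-survey scores and attendance rates for
# the same students. If the survey is valid, students who report higher
# engagement should tend to attend more often.
survey_scores = [3.2, 4.5, 2.8, 4.9, 3.7, 4.1]
attendance    = [0.82, 0.95, 0.70, 0.97, 0.88, 0.91]

# Pearson correlation; a clearly positive value corroborates the survey.
r = correlation(survey_scores, attendance)
print(f"Correlation with attendance: {r:.2f}")
```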

  15. Ensuring validity — example • Academic engagement (attachment to school) • The instrument should measure the same type of outcome (in this case, attitudes) as the intended outcome • The instrument should measure the outcome dimensions targeted by the intervention, including feelings about: teachers, students, being in school, and doing schoolwork • Students showing improvement in attitudes towards school should not exhibit contradictory behavior

  16. Sources of bias • Who, how, when, and where you ask • Who: non-responders are a hidden source of bias, as are participants who drop out early; sampling bias arises when participants don't have an equal chance of being selected • How: wording that encourages or discourages particular responses • When and where: timing and location can influence responses (e.g., conducting a satisfaction survey right after a free lunch) • Bias is often unintentional but can lead to over- or under-estimation of program results (a simple screen for non-response bias appears below)
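
Non-response bias can be screened for by comparing responders to the full roster on characteristics known for everyone. A minimal sketch with hypothetical roster data; the site field and the roster itself are invented for illustration:

```python
# Hypothetical roster: every enrolled participant, with a flag for
# whether they returned the survey. Site is known for everyone, so we
# can check whether responders resemble the full group.
roster = [
    {"site": "A", "responded": True},
    {"site": "A", "responded": True},
    {"site": "A", "responded": False},
    {"site": "B", "responded": True},
    {"site": "B", "responded": False},
    {"site": "B", "responded": False},
]

responders = [p for p in roster if p["responded"]]
print(f"Response rate: {len(responders) / len(roster):.0%}")

# If one site is heavily over-represented among responders, results may
# over-weight that site's experience -- a hidden source of bias.
for site in sorted({p["site"] for p in roster}):
    share_all = sum(p["site"] == site for p in roster) / len(roster)
    share_resp = sum(p["site"] == site for p in responders) / len(responders)
    print(f"Site {site}: {share_all:.0%} of roster, {share_resp:.0%} of responders")
```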

  17. Minimizing bias • Get data from as many respondents as possible • Work with program sites to maximize data collection • Follow up with non-responders • Take steps to reduce participant attrition • Pilot test instruments and data collection procedures • Mind your language: use neutral wording that neither encourages nor discourages particular responses • Time data collection to avoid circumstances that may distort responses

  18. Exercise #3 • Which statement is true? • Reliability refers to the ability of an instrument to: • Produce consistent results • Produce expected results • Produce accurate results • Be used for a long time • Be completed by respondents with ease

  19. Exercise #4 • Which statement is true? • Validity refers to the ability of an instrument to: • Produce consistent results • Produce expected results • Produce accurate results • Be used for a long time • Be completed by respondents with ease

  20. Exercise #5 • Bias can be minimized by doing all of the following except: • Getting data from as many respondents as possible • Sampling program participants • Using neutral language in questions • Pilot testing instruments • Reducing attrition of program participants • Following up with non-responders

  21. Measurement Scenarios • Explore these measurement scenarios to see how programs address issues of reliability, validity, and bias.

  22. Academic engagement

  23. Academic engagement—reliability, validity, bias

  24. Reading ability

  25. Reading ability—reliability, validity, bias

  26. Housing

  27. Housing—reliability, validity, bias

  28. Exercise habits

  29. Physical fitness—reliability, validity, bias

  30. Stream bank improvements

  31. Stream bank improvements—reliability, validity, bias

  32. Capacity Building—recruiting volunteers to serve more people

  33. Capacity Building—reliability, validity, bias

  34. Summary of key points • Steps to implement data collection include identifying the players involved in data collection, creating a data collection schedule, training data collectors, pilot testing instruments, and revising instruments as needed. • A data collection schedule identifies who will collect data, using which instrument, and when. • Train data collectors by walking them through the instrument and role playing the process. • Pilot testing involves having a small group of people complete an instrument and asking them about the experience.

  35. Summary of key points • Reliability, validity, and bias are key criteria for data quality. • Reliability is the ability of a method or instrument to yield consistent results under the same conditions. • Validity is the ability of a method or instrument to measure accurately. • Bias involves systematic distortion of results due to over- or under-representation of particular groups, question wording that encourages or discourages particular responses, or poorly timed data collection.

  36. Additional resources • CNCS Performance Measurement • http://nationalservice.gov/resources/npm/home • Performance measurement effective practices • http://www.nationalserviceresources.org/practices/topic/92 • Instrument Formatting Checklist • http://www.nationalserviceresources.org/files/Instrument_Development_Checklist_and_Sample.pdf • Practicum Materials • [URL]
