
Focus On… “Data Collection Choices”





Presentation Transcript


  1. Focus On… “Data Collection Choices” Presented by: Tom Chapel

  2. This Module… • Why and how of: • Developing indicators • Making good data collection choices • Using mixed methods effectively

  3. CDC’s Evaluation Framework • Indicator development bridges evaluation focus and data collection • [Framework graphic — Steps: Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned. Standards: Utility; Feasibility; Propriety; Accuracy]

  4. What is an indicator? A specific, observable, and measurable characteristic that shows progress toward a specified activity or outcome.

  5. Why Indicators? • “Gray” area between abstract concepts framed in evaluation questions and methods/sources of data collection • Indicators “operationalize” – restate abstract concepts in a tangible way • Tangible indicators help find/match appropriate data sources/methods • May, but need not, be S-M-A-R-T objectives

  6. Selecting Good Indicators 1. Construct Validity The indicator measures an important dimension of the activity or outcome, e.g., a measure of “quality” or “timeliness.”

  7. Selecting Good Indicators 2. Measure the activity or outcome itself, NOT the “fruits” or “so what” of the activity or outcome. For example: What constitutes a measure of good training? “Successful training implementation” is an indicator for good training. “Did participants learn something?” is a fruit of good training.

  8. Selecting Good Indicators 3. There must be at least one indicator for each activity or outcome of interest-- BUT, you may need multiple indicators. The use of multiple indicators is called “triangulation”.

  9. Good Indicators Can Vary in Level of Specificity

  10. Provider Education: Our Evaluation Focus • Activities: Conduct trainings; MD peer education and rounds; Nurse Educator presentation to LHD • Outcomes: Provider KAB increase; Provider policies; Providers know the registry and their role in it • Activities: Providers attend trainings and rounds; Providers receive and use Tool Kits; LHD nurses do private provider consults • Outcome: Provider motivation to do immunization increases

  11. Provider Education: Possible Indicators • Activity: Providers attend trainings and rounds • Indicators: Number of participants in trainings and rounds; Number of participants completing the series of trainings; Percent of participants by discipline; Percent of participants by region

  12. Provider Education: Possible Indicators • Activity: Providers receive and use Tool Kits • Indicators: Percent of providers who report use of the toolkit; Number of “call-to-action” cards received from the toolkit
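The count and percentage indicators on these slides are straightforward to compute once attendance data exist. A minimal sketch, assuming hypothetical attendance records (the field names, the four-session series length, and the sample values are all illustrative, not from the deck):

```python
from collections import Counter

# Hypothetical attendance records; field names and values are illustrative only.
attendance = [
    {"provider": "p1", "discipline": "MD",    "region": "North", "sessions": 4},
    {"provider": "p2", "discipline": "Nurse", "region": "North", "sessions": 2},
    {"provider": "p3", "discipline": "MD",    "region": "South", "sessions": 4},
    {"provider": "p4", "discipline": "Nurse", "region": "South", "sessions": 1},
]
SERIES_LENGTH = 4  # sessions needed to "complete the series" (assumed)

# Indicator: number of participants in trainings
n_participants = len(attendance)

# Indicator: number of participants completing the series of trainings
n_completed = sum(1 for a in attendance if a["sessions"] >= SERIES_LENGTH)

# Indicator: percent of participants by discipline
by_discipline = Counter(a["discipline"] for a in attendance)
pct_by_discipline = {d: 100 * c / n_participants for d, c in by_discipline.items()}

print(n_participants, n_completed, pct_by_discipline)
# 4 2 {'MD': 50.0, 'Nurse': 50.0}
```

The same pattern (count, then percent of a clearly defined denominator) covers the by-region indicator as well.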

  13. Data Collection Choices The Framework approach emphasizes use of findings: Not “Collect Data”, BUT “Gather Credible Evidence” Not “Analyze Data”, BUT “Justify Conclusions”

  14. Characterizing Data Collection Methods and Sources • Primary vs. secondary • primary: collecting data for the first time for the purpose of this project • secondary: making use of pre-existing data • Obtrusive vs. unobtrusive • the extent to which the respondent knows that data are being collected • Quantitative vs. qualitative • quantitative: deals with numbers • qualitative: deals with descriptions

  15. Quantitative and Qualitative • Quantitative → Quantity • Numbers - data which can be measured. • Length, height, area, volume, weight, speed, time, temperature, humidity, sound levels, cost. • Qualitative → Quality • Descriptions - data can be observed but not measured. • Colors, textures, smells, tastes, appearance, beauty, etc.

  16. Six (Most) Common Ways to Collect Data • Surveys • Interviews • Focus groups • Observation • Document review • Secondary data

  17. CDC’s Evaluation Framework • Standards inform good choices at Step 4 • [Framework graphic — Steps: Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned. Standards: Utility; Feasibility; Propriety; Accuracy]

  18. Choosing Methods—Cross-Walk to Evaluation Standards • Standards • Utility • Feasibility • Propriety • Accuracy

  19. Choosing Methods—Cross-Walk to Evaluation Standards • Standards • Utility - What is the purpose of the data collection? • Feasibility • Propriety • Accuracy

  20. Choosing Methods—Cross-Walk to Evaluation Standards • Standards • Utility • Feasibility - How much time? How much cost/budget? • Propriety • Accuracy

  21. Choosing Methods—Cross-Walk to Evaluation Standards • Standards • Utility • Feasibility • Propriety - Any ethical considerations? • Accuracy

  22. Choosing Methods—Cross-Walk to Evaluation Standards • Standards • Utility • Feasibility • Propriety • Accuracy - How valid and reliable do data need to be? What do “valid” and “reliable” mean in the context of the study?

  23. Trade-offs of Different Data Collection Methods • Methods: Personal Interview; Focus Groups; Document Review; Survey: Phone; Survey: Mail; Secondary Data; Observation • Factors: Time; Cost; Sensitive Issues; Hawthorne Effect; Ethics

  24. Example 1: Sexual Behavior of High School Males • Point-in-time estimate: sexual behavior of high school males • Indicator: What % of high school males have had a sexual encounter by the end of their junior year? • Criterion: Sensitive issue (consider accuracy)

  25. Example 1: Sexual Behavior of High School Males • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? • WHY?

  26. Example 1: Sexual Behavior of High School Males • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? Focus groups • WHY? Sensitive issue - the peer group is likely to distort responses.

  27. Example 1: Sexual Behavior of High School Males • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is BEST? Surveys • WHY? Anonymous (more accurate)

  28. Example 2: Intimate Partner Violence • Understanding context—intimate partner violence • Indicator: • Understand context and identify patterns of intimate partner violence. • Criterion: • Sensitive issue (consider accuracy)

  29. Example 2: Intimate Partner Violence • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? • WHY?

  30. Example 2: Intimate Partner Violence • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? Surveys • WHY? Unethical and will not elicit the data we need (consider utility).

  31. Example 2: Intimate Partner Violence • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is BEST? Interviews or focus groups • WHY? Build rapport through shared experiences

  32. Example 3: Reduce Lead Burden in Household • Aggressive housekeeping and nutrition behaviors to reduce lead burden. • Indicator: • Assess adoption of housekeeping and nutrition behaviors. • Criterion: • Sensitive issue • Hawthorne effect

  33. Example 3: Reduce Lead Burden in Household • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? Surveys, interviews • WHY? Inaccurate (desire to give socially acceptable responses)

  34. Example 3: Reduce Lead Burden in Household • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is BEST? Observation (garbage, coupons) • WHY? Passive and unobtrusive

  35. The Best Method Depends on the Specific Situation • All three examples involve a sensitive issue: • sexual behavior • intimate partner violence • good nutrition and housekeeping • Even though the criterion (sensitive issue) was the same, the best data collection method was different for each situation.

  36. Provider Education: Our Evaluation Focus • Activities: Conduct trainings; MD peer education and rounds; Nurse Educator presentation to LHD • Outcomes: Provider KAB increase; Provider policies; Providers know the registry and their role in it • Activities: Providers attend trainings and rounds; Providers receive and use Tool Kits; LHD nurses do private provider consults • Outcome: Provider motivation to do immunization increases

  37. Provider Education: Possible Indicators • Activity: Providers attend trainings and rounds • Indicators: Number of participants in trainings and rounds; Number of participants completing the series of trainings; Percent of participants by discipline; Percent of participants by region

  38. Provider Education: Possible Methods • Activity: Providers attend trainings and rounds • Indicators: Number of participants in trainings and rounds; Number of participants completing the series of trainings; Percent of participants by discipline; Percent of participants by region • Methods/Sources: Training logs; Registration info

  39. Provider Education: Possible Methods • Activity: Providers receive and use Tool Kits • Indicators → Methods/Sources: • Percent of providers who report use of the toolkit → Survey of providers • Number of “call-to-action” cards received from the toolkit → Analysis/count of call-to-action cards
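Once the survey responses and returned cards are in hand, the two indicators on this slide reduce to a percentage and a count. A minimal sketch, assuming hypothetical survey data (the field names and sample values are illustrative, not from the deck):

```python
# Hypothetical survey responses and returned call-to-action cards;
# field names and values are illustrative only.
survey = [
    {"provider": "p1", "uses_toolkit": True},
    {"provider": "p2", "uses_toolkit": False},
    {"provider": "p3", "uses_toolkit": True},
]
cards_received = ["card-017", "card-031", "card-044", "card-052"]

# Indicator: percent of providers who report use of the toolkit (from survey)
pct_use = 100 * sum(r["uses_toolkit"] for r in survey) / len(survey)

# Indicator: number of call-to-action cards received (count of cards)
n_cards = len(cards_received)

print(round(pct_use, 1), n_cards)
# 66.7 4
```

Note that the two indicators come from different sources (self-report vs. physical artifact), which is exactly the triangulation the earlier slides recommend.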

  40. Mixed Methods: Definition • A combination of methods that has • complementary strengths and • non-overlapping weaknesses. • The purpose is to • supplement or complement • the validity and reliability of the information.

  41. Why Mixed Methods? • “The Cs and the Es” • Corroboration and Clarification • understanding more defensibly, validly, credibly • “triangulation” • Explanation and Exploration • understanding more clearly • understanding the “why” behind the “what”
