
Qualitative Methods for Health Program Evaluation



Presentation Transcript


  1. Qualitative Methods for Health Program Evaluation CHSC 433 Module 5/Chapter 12 L. Michele Issel, PhD UIC School of Public Health

  2. “Different kinds of problems require different types of data.” Patton, 1997

  3. Objectives • List the major qualitative designs • List at least one pro and con for each of the major qualitative designs • Provide an outline of how qualitative data analyses are done

  4. Beyond the Paradigm Debate • The history of science favored quantitative methods (empiricism), deductive hypothesis testing, and logical positivism • Current science favors understanding based on rigorous methods

  5. Use Qualitative Methods when • You want to minimize research manipulation by studying the natural field setting • The program aims at individual outcomes (when the program aims at common outcomes across individuals, use quantitative methods)

  6. Key Characteristics • Use of non-numeric data, such as narratives, pictures, music • Use of subjective, experiential, naturalistic inquiry to explain phenomena • Use of inductive, iterative analysis • Holistic and contextual concerns • Attention to each individual’s uniqueness

  7. Functions of Qualitative Methods (Adapted from Green & Lewis ‘86) 1. Develop and delineate program elements 2. Boost the power of quantitative designs 3. Broaden the observational field 4. Analyze processes and cases to understand why or how the program worked 5. Generate a program or intervention theory 6. Use instead of quantitative methods

  8. Underlying Perspectives • Phenomenology: experiences and meanings • Ethnography: culture • Critical analysis: communication and power • Grounded theory: discovery of theory • Content analysis: manifest meanings in the written word

  9. Perspective --> Question • Phenomenology --> What does it mean to the person? • Ethnography --> What are the norms and values (culture)? • Critical analysis --> How has power shaped it? • Grounded theory --> What are the relationships (theory)? • Content analysis --> What themes are in the text?

  10. Major Types of Qualitative Methods • Participant observation • Case studies • In-depth Interviews • Focus groups • Open-ended survey questions

  11. Participant Observation • Collect data while acting as a member of a group • Make narrative notes and memos about processes, events, and people observed • Use key informants to verify data analysis

  12. Case Studies • Define what constitutes a case (organization, program, person) • Use a variety of raw data generated by or about the case: memoranda, observations, surveys, interviews, etc.

  13. Case Study

  14. In-depth Interviews • Use open-ended questions with key individuals (participants, key informants) • Use probes to clarify and explore issues or topics alluded to by the respondent or by earlier data analysis • Use a tape recorder and transcripts of the interviews

  15. In-depth Interviews

  16. Focus Groups • Carefully selected group of individuals who participate in guided discussion about a specific topic • Use a facilitator and a recorder

  17. Focus Groups

  18. Observations • Non-participatory and participatory techniques can be used • Need training on what will be observed and how to record the observation • Data collection methods vary: audio-visual recording, field notes, logs

  19. Observations

  20. Open-ended Survey Questions • Use open-ended questions placed at the end of a quantitative survey • Unable to use probes for clarification • Handwriting and spelling can make interpretation difficult

  21. Sampling for Naturalistic Inquiry • Small purposive samples: select for a specific characteristic • Theoretical sampling: select based on what “ought” to matter • Sampling for category saturation: select until no new information is gained from participants
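As a rough illustration of sampling for category saturation, the sketch below is not from the chapter; the function name, the stopping rule of three interviews with no new categories, and the coded data are all hypothetical. It only shows the logic of stopping once recent participants add no new information.

```python
# Hypothetical sketch: sample until category saturation, i.e. stop recruiting
# once several consecutive interviews contribute no new categories.

def sample_until_saturation(interviews, saturation_window=3):
    """interviews: iterable of sets of category labels coded from each interview."""
    seen_categories = set()
    interviews_without_new = 0
    sampled = 0
    for categories in interviews:
        sampled += 1
        new = categories - seen_categories
        if new:
            seen_categories |= new          # new information gained
            interviews_without_new = 0
        else:
            interviews_without_new += 1     # nothing new from this participant
        if interviews_without_new >= saturation_window:
            break                           # saturation reached
    return sampled, seen_categories

# Example with made-up coded categories from five participants
coded = [{"access", "cost"}, {"cost", "stigma"}, {"access"}, {"cost"}, {"stigma"}]
n, cats = sample_until_saturation(coded)
print(f"Sampled {n} participants; categories: {sorted(cats)}")
```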

  22. Data Analysis • Coding and interpreting the data • To count or not to count

  23. Coding Terminology • Category: classification of concepts in the data • Dimension: implies a continuum • Property: attributes or characteristics of a category • Constant comparison: process to develop categories by comparing new data with existing categories • Codable unit: a unit of data to be categorized
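One way to picture this terminology is as a small data structure. The sketch below is purely illustrative (the class, field names, and example values are not from the chapter); it simply shows a category holding its properties and the dimensions along which those properties vary.

```python
# Illustrative only: coding terminology expressed as a simple data structure.
from dataclasses import dataclass, field

@dataclass
class Category:
    name: str                                         # classification of concepts in the data
    properties: list = field(default_factory=list)    # attributes or characteristics of the category
    dimensions: dict = field(default_factory=dict)    # property -> the continuum it implies

# Hypothetical category from an evaluation of access to care
barrier = Category(
    name="barriers to care",
    properties=["cost", "distance"],
    dimensions={"cost": "low to high", "distance": "near to far"},
)
print(barrier)
```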

  24. Analysis Procedures • Identify codable units of data • Understand the meaning • Discover categories • Name categories • Discover properties and dimensions of the categories • Generate explanation
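The steps above can be sketched as a constant-comparison loop. The example below is a deliberate simplification and not the chapter's procedure: a crude word-overlap check stands in for the interpretive judgement of a human coder, and all names and data are hypothetical. It is only meant to show the iterative shape of comparing each new codable unit with existing categories.

```python
# Hypothetical sketch of constant comparison: each new codable unit is compared
# with existing categories; matches extend a category, non-matches start a new one.

def constant_comparison(codable_units, similarity):
    categories = {}                                   # category name -> list of codable units
    for unit in codable_units:
        match = next((name for name, members in categories.items()
                      if any(similarity(unit, m) for m in members)), None)
        if match:
            categories[match].append(unit)            # compare new with existing categories
        else:
            categories[unit] = [unit]                 # discover and name a new category
    return categories

def similar(a, b):
    """Crude stand-in for human judgement: units are 'similar' if they share a word."""
    return bool(set(a.lower().split()) & set(b.lower().split()))

units = ["clinic hours too short", "hours conflict with work", "no bus route nearby"]
for name, members in constant_comparison(units, similar).items():
    print(name, "->", members)
```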

  25. Scientific Rigor = Trustworthiness (Lincoln & Guba, 1985) • Credibility ~ Internal validity • Transferability ~ External validity • Dependability ~ Reliability • Confirmability ~ Objectivity

  26. Credibility Have confidence in the truth of the findings: • Invest sufficient time and triangulate • Use outsiders for insights (peer debriefing) • Refine working hypotheses with negative cases • Check findings against raw data • Use participant feedback

  27. Transferability Applicability to other contexts and respondents • Provide thick (detailed, comprehensive) descriptions for others to assess possibility of transferability

  28. Dependability The same results would be found if the study were repeated • Leave a trail that can be followed so that others can see the findings are supported by the data

  29. Confirmability Findings come from the respondents, not the researcher • Leave an audit trail (same as for Dependability)

  30. To count or not to count • Number of participants who mentioned a category • Number of times a category was mentioned throughout the study • Issues: neither count helps with interpretation of meanings, and both can misrepresent the scope of the sentiment
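The two counts above can diverge, as the small hypothetical example below shows: with three made-up participants, the "cost" category is mentioned by 2 of 3 participants but appears 5 times in total. The data and category labels are invented for illustration only.

```python
# Hypothetical coded data: participant id -> list of categories mentioned (with repeats)
mentions = {
    "P1": ["cost", "cost", "stigma"],
    "P2": ["cost", "cost", "cost"],
    "P3": ["stigma"],
}

category = "cost"
participants_mentioning = sum(1 for cats in mentions.values() if category in cats)
total_mentions = sum(cats.count(category) for cats in mentions.values())

print(f"'{category}': {participants_mentioning} of {len(mentions)} participants, "
      f"{total_mentions} mentions in total")
```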

  31. From Data to Description • Categories as typologies are the rudiments of a theory • Category dimensions and properties are essential • Linkages between categories form theory

  32. Data Presentation • Use descriptions of context to show transferability • Use tables showing category development to show dependability and confirmability • Use participants’ words to show confirmability • Use diagrams of relationships among categories

  33. Realities of Data Analysis • Messy, confusing, repetitive • Iterative category development • Overwhelming quantities of data • Conflicting interpretations of data by peers and participants • Manifest versus implied meanings cloud data analysis • Investigator biases

  34. Cost of Data Collection • Interview time • Travel to the interviewee • Reading and listening • Transcription time (1 hour of interview: 3 hours of transcribing) • Data analysis time
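Using the 3:1 transcription-to-interview ratio from the slide, a rough time budget might look like the sketch below. Apart from that ratio, every figure (number of interviews, travel, reading, and analysis time) is an assumption made up for illustration.

```python
# Rough time budget for qualitative data collection (hours).
# The 3:1 transcription ratio comes from the slide; the other figures are assumed.

n_interviews = 10
interview_hours = 1.0 * n_interviews           # 1-hour interviews
travel_hours = 0.5 * n_interviews              # assumed half hour of travel per interview
transcription_hours = 3.0 * interview_hours    # 3 hours transcribing per interview hour
reading_hours = 1.0 * n_interviews             # reading and listening back
analysis_hours = 2.0 * n_interviews            # coding and interpretation

total = (interview_hours + travel_hours + transcription_hours
         + reading_hours + analysis_hours)
print(f"Estimated total: {total:.0f} hours for {n_interviews} interviews")
```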

  35. Evaluation Caveats • Integrate with quantitative data • Use of participant feedback (credibility) as key to acceptance of findings • Stories are more powerful than numbers and make the numbers more human

  36. Qualitative Methods Across the Pyramid Each qualitative method has potential usefulness for programs at each level of the Pyramid.
