
Data Analysis 201



Presentation Transcript


  1. Data Analysis 201 Lakshmy Menon, MPH Health Scientist, Program Evaluation Team TB Program Evaluation Network Meeting September 19, 2012 National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention Division of Tuberculosis Elimination

  2. Objectives • Describe different types of evaluation • Describe qualitative and quantitative measures for program evaluation (PE) • Identify appropriate evaluation data sources and types of analysis for use in PE • Describe how to interpret conclusions drawn from measures • Share examples from the frontline of PE data sources, measures, analyses, and conclusions

  3. Choose the Type of Evaluation • What are you trying to evaluate? • What are your evaluation questions? • How do you intend to use the findings? • Does your evaluation include multiple populations/target audiences?

  4. Types of Evaluation • Formative: needs assessment, evaluability assessment, implementation evaluation, process evaluation • Summative: outcome evaluation, impact evaluation, cost-effectiveness and cost-benefit evaluation

  5. Formative Evaluation (1) • Needs assessment: who, what, magnitude, potential strategies • Evaluability assessment: feasibility; stakeholder-oriented

  6. Formative Evaluation (2) • Implementation evaluation: monitoring fidelity of implemented activities to planned activities • Process evaluation: explores the method/process of delivering activities or interventions

  7. Formative Evaluation (3) • Internal evaluation for judging the worth of a program as you identify and develop program activities • Helps you find out whether you are achieving your goals and objectives instead of waiting until the project is finished • A type of monitoring: are the method and content of the intervention working? • Can strengthen and improve activities in progress

  8. Summative Evaluation (1) • Outcome evaluation: determines whether activities or interventions affected defined outcomes • Impact evaluation: broad; assesses intended and unintended effects of activities • Experimental and quasi-experimental designs require greater control over the program setting and are the gold standard

  9. Summative Evaluation (2) • Cost-effectiveness (CE) and cost-benefit (CB) analyses: determine whether activities were efficient in units of cost (dollars or other pre-determined, defined values)

  10. Summative Evaluation (3) • An external evaluation judges whether the program activities achieved the intended outcome • Conducted after an intervention is implemented • Specifically describes whether activities caused outcomes

  11. Types of Evaluation and Evaluation Questions (1) • Formative • What is the problem? • What are its definition and scope? • What is its magnitude?

  12. Types of Evaluation and Evaluation Questions (2) • Summative • What type of evaluation is feasible? • Did the activities contribute to the observed effects? (quasi-experimental, experimental designs) • What is the net impact of the program? (econometric methods, qualitative methods)

  13. Measurement and Measures • What is measurement? The process of observing and recording observations; a set of methodological procedures intended to translate constructs into observables • Levels of measurement assist in deciding how to interpret data from variables and what statistical analysis is appropriate • The four levels are nominal, ordinal, interval, and ratio
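The four levels can be illustrated with a small stdlib-only Python sketch; the variable names and data below are hypothetical examples, not values from the presentation, and the point is only which summary statistic each level supports.

```python
# Sketch: how the level of measurement constrains the summary statistic.
from statistics import mode, median, mean

site_of_disease = ["pulmonary", "pulmonary", "extrapulmonary", "pulmonary"]  # nominal
health_status = [1, 2, 2, 4, 5]      # ordinal: 1 = very good ... 5 = very poor
temperature_f = [98.6, 101.2, 99.5]  # interval: differences meaningful, zero arbitrary
weight_kg = [61.0, 72.5, 80.1]       # ratio: true zero, so ratios are meaningful

print(mode(site_of_disease))   # nominal: mode is the only meaningful "average"
print(median(health_status))   # ordinal: median respects order, not spacing
print(mean(temperature_f))     # interval: means and differences are meaningful
print(max(weight_kg) / min(weight_kg))  # ratio: "1.3 times heavier" makes sense
```

This is why the later slides pair nominal/ordinal measures with both quantitative and qualitative analysis, but interval/ratio measures with quantitative analysis.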

  14. Types of Analyses (1) • Quantitative analysis • Tests of difference: means and proportions; one vs. two groups • Cause and effect • Relationship • Prediction
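A two-group test of difference in proportions can be sketched in plain Python. The normal-approximation z-test below is one common choice, not one named in the slides, and the cohorts and completion counts are hypothetical.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Normal-approximation z-test for the difference of two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)            # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: treatment-completion counts at two clinics
z, p = two_proportion_z(80, 100, 60, 100)
print(round(z, 2), round(p, 4))
```

With larger samples or means rather than proportions, a t-test would be the analogous choice; a statistics package would normally handle this, but the arithmetic is as shown.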

  15. Types of Analyses (2) • Qualitative Analysis • Thematic analysis • Words, phrases, ideas, reflecting thoughts and feelings • Can examine via stratification (men/women; nurse provider/physician) • Quantitative and qualitative methods, data, and analyses support each other!
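The stratified thematic tally described above can be sketched with a stdlib counter; the groups, theme codes, and coded excerpts below are hypothetical illustrations.

```python
from collections import Counter

# Each record: (respondent group, theme code assigned to an interview excerpt)
coded_excerpts = [
    ("nurse", "staffing"), ("nurse", "training"), ("nurse", "staffing"),
    ("physician", "training"), ("physician", "scheduling"),
]

# Tally theme mentions separately within each stratum
by_group = {}
for group, theme in coded_excerpts:
    by_group.setdefault(group, Counter())[theme] += 1

for group, counts in sorted(by_group.items()):
    print(group, counts.most_common())
```

The counts feed back into quantitative analysis (the themes become nominal categories), which is one way the two approaches support each other.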

  16. Measures and Types of Analyses (1) • Measure • Nominal: categorical data; order does not matter • Gender and sex, marital status, race/ethnicity • Site of TB disease, HIV status • Thematic categories • Analysis • Quantitative • Qualitative

  17. Measures and Types of Analyses (2) • Measure • Ordinal • Order matters but not differences between values • Health status (Very good → very poor) • Analysis • Quantitative • Qualitative

  18. Measures and Types of Analyses (3) • Measure • Interval • Differences between values matter (e.g., temperature scale, SAT scores) • Analysis • Quantitative

  19. Measures and Types of Analyses (4) • Measure • Ratio • Zero point clearly defined (absolute zero) • Height, weight • Kelvin temperature scale • Population, income • Analysis • Quantitative
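The interval/ratio distinction above can be made concrete with a temperature example of my own (not from the deck): 40 °C is not "twice as hot" as 20 °C, because the Celsius zero is arbitrary; converting to Kelvin, a true ratio scale, gives the meaningful ratio.

```python
def c_to_k(celsius):
    """Convert Celsius (interval scale) to Kelvin (ratio scale)."""
    return celsius + 273.15

naive_ratio = 40 / 20                 # misleading on an interval scale
true_ratio = c_to_k(40) / c_to_k(20)  # meaningful on a ratio scale
print(naive_ratio, round(true_ratio, 3))
```

Height, weight, population, and income behave like Kelvin: their zeros are real, so ratios between values are interpretable.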

  20. Data Sources (1) • Information systems: surveillance systems, EHRs/EMRs • Documents: medical records, meeting minutes, organizational documents, activity logs, insurance records, hospital records • Individuals: staff, providers, patients, other partners • Observations: evaluator observations of staff environment, office flow, activities

  21. Data Sources (2) • National and State TB Program Objectives • RVCT • State surveillance reports and systems • NTIP • ARPE • GIMS • EDN • Census • Local surveys

  22. Data Collection Methods (1) • Surveillance reports: morbidity, mortality, other disease and demographic data • Surveys: knowledge, attitudes, practices • Individual interviews: knowledge, attitudes, practices

  23. Data Collection Methods (2) • Focus groups • Knowledge, attitudes, practices • Observation • Process, procedure, environment • Document reviews • Organizational/hierarchical/systems level information on process and procedures

  24. Interpreting Results (1) • Quantitative • Means • Proportions • Cause and effect • Relationships and prediction

  25. Interpreting Results (2) • Qualitative • Content Analysis • Thematic Analysis

  26. Examples of Evaluation Questions, Measures, and Analyses

  27. Needs Assessment: Questions, Measures, Analyses (1) • Program capacity to conduct contact investigations • Questions • What does staff know? To what extent? • What do they not know? To what extent? • What other needs do they have?

  28. Needs Assessment: Questions, Measures, Analyses (2) • Measures • Nominal • Type of infectious disease knowledge/TB knowledge • Ordinal, interval • Past experience performing tasks; level of comfort performing tasks (patient interviews, counseling patients on LTBI therapy, etc.) • Analysis • Quantitative • Qualitative

  29. Implementation Evaluation: Questions, Measures, Analyses (1) • New data quality procedures and activities • Questions • What were the planned data quality activities? • What protocols, processes and activities were implemented? • Did the protocols, processes, and activities follow the plan?

  30. Implementation Evaluation: Questions, Measures, Analyses (2) • Measures • Nominal • Checklists of quality standards • Data collection, entry, and management documents • Training materials and training of appropriate staff • Staff conducting activities • Analysis • Quantitative • Qualitative

  31. Examples from the Field • Evaluation questions • Design • Data sources • Measures • Analyses • Conclusions

  32. Questions?

  33. Acknowledgements • Terry Chorba, MD, DSc • Jason Cummins, MPH • Cheryl Kearns, MPH • Awal Khan, PhD

  34. Resources • Survey research methods - ?? • Utilization-focused evaluation - Patton • Program evaluation and performance measurement - ?? • Qualitative Research - Silverman, 2009 • Please email for a comprehensive list of resources

  35. Backup Slides

  36. Developing Measures (3) • Lower vs. higher levels of measurement • Lower (nominal, ordinal): assumptions are less restrictive and analyses are less sensitive • Higher (interval, ratio): each level adds something new to the qualities of the lower levels

  37. Using Data in Evaluation • Data can: • Help evaluate program effectiveness and maintain focus on outcomes • Provide feedback to internal and external stakeholders on program successes, challenges, and needs • Assist in identifying gaps that may not be evident at a micro level

  38. Evaluation, Data, and Data Sources (2) • Evaluability assessments • Stakeholders • Phase in program cycle • Organizational capacity

  39. Evaluation, Data, and Data Sources (3) • Implementation evaluation: program protocols, program documents and registries, meeting minutes, patient charts • Process evaluation: program protocols, staff reports
