
Health Program Effect Evaluation Questions and Data Collection Methods


Presentation Transcript


  1. Health Program Effect Evaluation Questions and Data Collection Methods CHSC 433 Module 5/Chapter 9 L. Michele Issel, PhD UIC School of Public Health

  2. Objectives • Develop appropriate effect evaluation questions • List pros and cons for various data collection methods • Distinguish between types of variables

  3. Involve Evaluation Users so they can: • Judge the utility of the design • Know the strengths and weaknesses of the evaluation • Identify differences in criteria for judging evaluation quality • Learn about methods • Debate the methods BEFORE the data are in hand

  4. Terminology The following terms are used in reference to basically the same set of activities and for the same purpose: • Impact evaluation • Outcome evaluation • Effectiveness evaluation • Summative evaluation

  5. Differences between Research and Evaluation • Nature of the problem addressed: new knowledge vs assessing outcomes • Goal of the research: new knowledge for prediction vs social accounting • Guiding theory: theory for hypothesis testing vs theory for the problem • Appropriate techniques: sampling, statistics, hypothesis testing, etc. vs fit with the problem

  6. Research-Evaluation Differences

  7. Research-Evaluation Differences

  8. Evaluation Questions… • What questions do the stakeholders want answered by the evaluation? • Do the questions link to the impact and outcome objectives? • Do the questions link to the effect theory?

  9. From Effect Theory to Effect Evaluation • Consider the effect theory as a source of variables • Consider the effect theory as guidance on design • Consider the effect theory as informing the timing of data collection

  10. From Effect Theory to Variables The next slide is an example of using the effect theory components to identify possible variables on which to collect evaluation data.

  11. Impact vs Outcome Evaluations • Impact evaluation is more realistic because it focuses on the immediate effects and participants are probably more accessible. • Outcome evaluation is more policy-oriented, longitudinal, and population based, and therefore more difficult and costly. Also, causality (the conceptual hypothesis) is fuzzier.

  12. Effect Evaluation Draws upon what is known about how to conduct rigorous research: Design -- the overall plan, such as experimental, quasi-experimental, longitudinal, qualitative Method -- how the data are collected, such as telephone survey, interview, observation

  13. Methods --> Data Sources • Observation --> logs, video • Record review --> client records, patient charts • Survey --> participants/non-participants, family • Interview --> participants/non-participants • Existing records --> birth & death certificates, police reports

  14. Comparison of Data Collection Methods Characteristics of each method to be considered when choosing a method: • Cost • Amount of training required for data collectors • Completion time • Response rate

  15. Validity and Reliability • Method must use valid indicators/measures • Method must use reliable processes for data collection • Method must use reliable measures
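The slide names the requirements but not a statistic for checking them. One widely used internal-consistency reliability check for a multi-item measure is Cronbach's alpha; a minimal Python sketch, assuming responses are arranged as a respondents-by-items array (the sample data are hypothetical):

    import numpy as np

    def cronbachs_alpha(items):
        """Internal-consistency reliability for a respondents-by-items array."""
        k = items.shape[1]                         # number of items in the scale
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical data: 5 respondents answering a 3-item knowledge scale
    responses = np.array([[4, 5, 4],
                          [2, 2, 3],
                          [5, 4, 5],
                          [3, 3, 2],
                          [1, 2, 1]])
    print(round(cronbachs_alpha(responses), 2))    # values near 1 suggest a reliable scale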

  16. Variables, Indicators, Measures • Variable is the “thing” of interest, in the form in which it gets measured • Some agencies use “indicator” to mean the number that indicates how well the program is doing • Measure is the way that the variable is known It’s all just language…. Stay focused on what is needed.

  17. Levels of Measurement
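The table for this slide is not transcribed; the standard four levels are nominal, ordinal, interval, and ratio. A minimal Python sketch tying them to the variable/indicator/measure vocabulary above (all example variables and measures are hypothetical):

    # Hypothetical evaluation variables, each tagged with the measure by
    # which it is known and its level of measurement.
    variables = [
        {"variable": "smoking status",         # nominal: named categories, no order
         "measure": "self-report survey item", "level": "nominal"},
        {"variable": "self-rated health",      # ordinal: ordered, unequal gaps
         "measure": "1-5 Likert item",         "level": "ordinal"},
        {"variable": "body temperature (F)",   # interval: equal gaps, no true zero
         "measure": "clinic thermometer",      "level": "interval"},
        {"variable": "clinic visits per year", # ratio: true zero, ratios meaningful
         "measure": "record review count",     "level": "ratio"},
    ]

    for v in variables:
        print(f"{v['variable']:24} {v['level']:9} via {v['measure']}")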

  18. Types of Effects as Documented through Indicators • Indicators of physical change • Indicators of knowledge change • Indicators of psychological change • Indicators of behavioral change • Indicators of resource change • Indicators of social change

  19. Advice It is more productive to focus on a few relevant variables than to go on a wide-ranging fishing expedition. Carol Weiss (1972)

  20. Variables • Intervening variable: any variable that forms a link between the independent and dependent variables, AND without which the independent variable is not related to the dependent variable (outcome).
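A minimal simulation of this definition, assuming a hypothetical chain in which a program raises knowledge and knowledge alone drives the outcome (all names and coefficients are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    program = rng.integers(0, 2, n)                  # independent variable: enrolled or not
    knowledge = 2.0 * program + rng.normal(0, 1, n)  # intervening variable: program raises knowledge
    outcome = 1.5 * knowledge + rng.normal(0, 1, n)  # outcome depends only on knowledge

    # The program relates to the outcome only through the intervening variable:
    print(np.corrcoef(program, outcome)[0, 1])       # clearly positive
    # Holding the intervening variable (nearly) fixed removes the link:
    mid = np.abs(knowledge - 1.0) < 0.2              # a thin slice of knowledge values
    print(np.corrcoef(program[mid], outcome[mid])[0, 1])  # near zero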

  21. Variables • Confounding variable: an extraneous variable that accounts for all or part of the effect on the dependent variable (outcome); it can mask the true underlying association. • A confounder must be associated with both the dependent variable AND the independent variable.

  22. Confounders • Exogenous (outside of individuals) confounding factors are uncontrollable (selection bias, coverage bias). • Endogenous (within individuals) confounding factors are equally important: secular drift in attitudes/knowledge, maturation (children or the elderly), seasonality, interfering events that alter individuals.
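A minimal sketch of exogenous confounding through selection bias, in the same simulation style: a hypothetical income variable drives both who enrolls and the health outcome, so a naive comparison credits the program with an effect it does not have:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    income = rng.normal(0, 1, n)                 # confounder: drives both lines below
    enrolled = income + rng.normal(0, 1, n) > 0  # selection bias: higher income, more enrollment
    health = income + rng.normal(0, 1, n)        # outcome driven by income, NOT the program

    # Naive comparison suggests the program "works":
    print(health[enrolled].mean() - health[~enrolled].mean())  # clearly positive
    # Comparing within a narrow income band removes the difference:
    band = np.abs(income) < 0.1
    print(health[enrolled & band].mean() - health[~enrolled & band].mean())  # near zero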

  23. Variable story… To get from Austin to San Antonio, there is one highway. Between Austin and San Antonio there is one town, San Marcos. San Marcos is the intervening variable because it is not possible to get to San Antonio from Austin without going through San Marcos. The highway is often congested, with construction and heavy traffic. The highway conditions are the confounding variable because they are associated with both the trip (my car, my state of mind) and with arriving (alive) in San Antonio.

  24. Measure Program Impact Across the Pyramid
