
Process Evaluation: Considerations and Strategies


Presentation Transcript


  1. Process Evaluation: Considerations and Strategies CHSC 433 Module 4/Chapter 8 L. Michele Issel UIC School of Public Health

  2. Making Widgets Count not only the widgets, but who gets the widgets and what goes into making the widgets.

  3. Definitions • Systematic examination of coverage and delivery • Measuring inputs into the program • Finding out whether the program has all its parts, and whether those parts are functional and operational

  4. Program Components Discrete interventions, or groups of interventions, within the overall program that are designed to independently or synergistically affect recipients. Objectives are written per component.

  5. Types of Implementation Evaluations • Effort: quantity of input • Monitoring: use of MIS information • Process: internal dynamics, strengths and weaknesses • Component: assess distinct program parts • Treatment: what was supposed to have an effect

  6. Purpose of Process Evaluation • Assurance and accountability • Understanding of outcomes • Mid-course corrections • Program replicability

  7. From the Perspective of: • The evaluator • Funders (accountability) • Program management

  8. Stakeholders and Expectations • Focus on the explicit: objectives, program descriptions • Uncover the implicit: review program theory, review objectives, role-play possible outcomes

  9. Program Theory Components

  10. Organizational Plan How to garner, configure, and deploy resources and organize program activities so that the intended service is developed and maintained

  11. Service Utilization Plan How the intended target population receives the intended amount of the intended intervention through interaction with the program’s service delivery system

  12. Rigor in Process Evaluation • Appropriateness of the method • Sampling strategy • Validity and Reliability • Timing of data collection

  13. Key Decision: 1 How much effort to expend ~ what data are needed to accurately describe the program. Choose based on: • Expected across-site variation • Making the report credible • Literature about the intervention

  14. Key Decision: 2 What to look for ~ which program features are most critical and valuable to describe. Choose based on: • What is most often cited in the program proposal • The budget • What may be related to program failure

  15. Go Back to Objectives • Process objectives per program component: • How much • Of what • Will be done • By whom • By when

  16. Components and Objectives

  17. Possible Foci of Process Evaluation • Place: site, program • People: practitioner/provider, recipient/participant • Processes: activities, structure, policy

  18. Levels of Analysis • Individuals: program participants, program providers • Programs: as a whole • Geographic locations: regions and states

  19. Types of Questions • What was done and by whom? • How well was it done and how much was done? • What contributed to success/failure? • How much of what resources were used? • Is there program drift?

  20. Sources of Program Variability • Staff preferences and interests • Materials availability and appropriateness • Participants' expectations, receptivity, etc. • Site physical environment and organizational support

  21. Roots of Program Failure

  22. Causes of Program Failure • Non-program: no participants, no program delivered • Wrong intervention: not appropriate for the problem • Unstandardized intervention: across-site, within-program variations

  23. Causes of Program Failure (cont.) • Mismanagement of program operations • Wrong recipients • Barriers to the program • Program components unevenly delivered or monitored

  24. Data Sources from the Program • Resources used • Participant-provided data: quality, match with the process evaluation • Existing records: sampling of records, validity and reliability issues

  25. Data Sources from the Evaluator • Surveys and interviews of participants • Observation of interactions • Surveys and interviews of staff

  26. Evaluating Structural Inputs • Organizational structure: supervision of staff, place in the organizational hierarchy • Facilities, equipment • Human resources: leadership, training

  27. Measures of Delivery • Measures of program delivery • Measures of coverage • Measures of effectiveness

  28. Measures of Implementation • Measures of volume (outputs): number of services provided • Measures of workflow: client time, staff work time

  29. Targets, Recipients, and Coverage

  30. Measures of Coverage • Undercoverage = # recipients in need / # in need • Overcoverage = # recipients not in need / # recipients • Coverage efficiency = (undercoverage - overcoverage) x 100
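As a worked illustration of the slide 30 formulas, here is a minimal Python sketch; the function name and all counts are hypothetical.

```python
def coverage_measures(recipients_in_need, total_in_need,
                      recipients_not_in_need, total_recipients):
    """Coverage measures as defined on slide 30 (illustrative sketch)."""
    undercoverage = recipients_in_need / total_in_need          # share of those in need who were reached
    overcoverage = recipients_not_in_need / total_recipients    # share of recipients who were not in need
    coverage_efficiency = (undercoverage - overcoverage) * 100  # (under - over) x 100
    return undercoverage, overcoverage, coverage_efficiency

# Hypothetical example: 300 of 500 people in need were served; 60 of the 360 recipients were not in need.
under, over, efficiency = coverage_measures(300, 500, 60, 360)
print(under, over, efficiency)  # 0.6, 0.1666..., ~43.3
```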

  31. Measures of Effectiveness • Effectiveness Index = % reached per program standard, per program component • Program Effectiveness Index = sum of component Effectiveness Indexes / # of program components
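A minimal sketch of the slide 31 indexes, assuming each component has a numeric program standard; the component figures below are hypothetical.

```python
def effectiveness_index(number_reached, program_standard):
    """Percent of the program standard reached for one component (slide 31)."""
    return 100 * number_reached / program_standard

def program_effectiveness_index(component_indexes):
    """Sum of the component effectiveness indexes divided by the number of components."""
    return sum(component_indexes) / len(component_indexes)

# Hypothetical components: (number reached, program standard)
components = [(180, 200), (150, 150), (300, 400)]
indexes = [effectiveness_index(reached, standard) for reached, standard in components]
print(indexes)                                # [90.0, 100.0, 75.0]
print(program_effectiveness_index(indexes))   # 88.33...
```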

  32. Bias in Participation • Due to self-selection • Results in under- or overcoverage • May be related to recruitment • Can be identified with good data collection (monitoring)

  33. Measures of Efficiency • Ratio of inputs per output • Productivity per staff member, per cost, per hour • Cost per participant, per intervention • etc.
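A minimal sketch of the input-per-output ratios listed on slide 33; all figures and variable names are hypothetical.

```python
total_cost = 45_000.00   # program spending for the period (hypothetical)
staff_hours = 1_200.0    # total staff work time (hypothetical)
participants = 300       # participants served (hypothetical)
interventions = 900      # intervention sessions delivered (hypothetical)

cost_per_participant = total_cost / participants            # 150.0
cost_per_intervention = total_cost / interventions          # 50.0
participants_per_staff_hour = participants / staff_hours    # 0.25
print(cost_per_participant, cost_per_intervention, participants_per_staff_hour)
```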

  34. Evaluating Costs • Payments by agency • Payments by secondary funders • Payments by participants versus charges!

  35. Monitoring and CQI • Similar types of data presentation • Control charts • Fishbone diagrams • Flow charts • Gantt charts • etc. • Overlapping purposes
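As an illustration of one of the CQI-style data presentations listed on slide 35, here is a minimal control-chart sketch using the mean plus or minus three standard deviations as control limits; the weekly counts are hypothetical monitoring data.

```python
from statistics import mean, stdev

weekly_visits = [42, 38, 45, 40, 37, 44, 41, 39, 46, 43]  # hypothetical monitoring data
center = mean(weekly_visits)
sigma = stdev(weekly_visits)
upper, lower = center + 3 * sigma, center - 3 * sigma      # control limits

for week, count in enumerate(weekly_visits, start=1):
    flag = "  <-- outside control limits" if not (lower <= count <= upper) else ""
    print(f"week {week:2d}: {count}{flag}")
print(f"center line {center:.1f}, limits [{lower:.1f}, {upper:.1f}]")
```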

  36. Reaching Conclusions • Compare data to objectives • Compare data to needs assessment data • Compare data to other sites or other programs

  37. Worksheet Exercise • For each program objective: • What are the focus and level of the process evaluation? • What data sources are needed? • Who collects the data?

  38. References • Rossi, Freeman, & Lipsey (1999). Evaluation: A Systematic Approach. Sage Publications. • Patton (1997). Utilization-Focused Evaluation. Sage Publications. • King, Morris, & Fitz-Gibbon (1987). How to Assess Program Implementation. Sage Publications. • Weiss (1972). Evaluation Research. Prentice Hall.
