
Program Evaluation Screencast

Prepared by Mary Secret. Recording: https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758




Presentation Transcript


  1. Program Evaluation Screencast. Prepared by Mary Secret. Based on materials from the following sources:
     • Babbie, E. (2014). The Practice of Social Research (14th ed.). Boston, MA: Thomson Wadsworth.
     • Corcoran, J., & Secret, M. (2013). Social Work Research Skills Workbook. New York: Oxford.
     • Engel, R. J., & Schutt, R. K. (2013). The Practice of Research in Social Work (3rd ed.). Thousand Oaks, CA: Sage.
     ECHO 360 link: https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758

  2. What is the purpose of program evaluation? To investigate social programs and to assess the effectiveness of social policies and programs.

  3. Program Evaluation Prologue • … is not a specific activity or method that you can point to or associate with any particular step of the research process • … encompasses all aspects of research processes and methods

  4. A major, comprehensive program evaluation can: • include experimental and non-experimental research designs, • use both qualitative and quantitative approaches, • collect data from secondary data sources or by interviewing participants, • use standardized or non-standardized measurement instruments, and • include both probability and nonprobability samples. Whatever its form, it must adhere to standard research ethics.

  5. Program evaluation is distinguished from other types of social science research not by its design, method, or approach, but by the underlying intent: the purposes that guide the evaluation process.

  6. What is the purpose of program evaluation? To investigate social programs and to assess the effectiveness of social policies and programs.

  7. Question FIRST!!! The specific methods depend on the evaluation question of interest about a specific program, policy, or intervention. Questions to be answered: • Is the program needed? Do a needs assessment. • How does the program operate? Do a formative or process evaluation. • What is the program’s impact? Do a summative or outcome evaluation. • How efficient is the program? Do a cost-benefit or a cost-effectiveness analysis.

  8. The language of evaluation (fill in the blank):
     • Outcomes: the impact of the program; the intended result; the response variable; the dependent variable.
     • Outputs: the services delivered or new products produced by the program process.
     • Inputs: the resources, raw materials, clients, and staff that go into a program.
     • Target population: the population for whom the program is designed.
     • Stakeholders: individuals and groups who have some basis of concern with the program, often setting the research agenda and controlling research findings.
     • Feedback: information about service delivery system outputs, outcomes, or operations that is available to any program stakeholders.

  9. What is a needs assessment? Systematically researching questions about the needs of a target population for program-planning purposes. It obtains information from:
     • Key informants: expert opinions from individuals who have special knowledge about the needs and about the existing services.
     • Rates under treatment: secondary analysis of existing statistics to estimate the need for services based on the number and characteristics of clients who are already being served.
     • Social indicators: existing statistics that reflect conditions of an entire population, e.g., census data, Kids Count data.
     Rubin, A., & Babbie, E. (2007). Essential Research Methods for Social Work. CA: Brooks/Cole.

  10. What is a process or formative evaluation? It asks how you know whether the service was delivered in the manner intended, i.e., according to the protocol or an evidence-based practice model. It must measure (collect data on) the independent variable, the intervention: what services were actually delivered, e.g., the number of counseling sessions, hours of training, number of meetings, etc.

  11. What is an outcome evaluation? Also known as an impact evaluation or summative evaluation: evaluation research that examines the effectiveness of the treatment or other service. • The program is the independent variable (the treatment). • The outcomes are the dependent variables. • An experimental design is the preferred method for maximizing internal validity because of random assignment into an experimental group and a control/comparison group, and manipulation of the independent variable (see the sketch below).
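To make that causal logic concrete, here is a minimal sketch in Python, assuming a hypothetical program with 100 clients and a simulated 5-point treatment effect; the numbers and names are invented for illustration, not taken from the screencast.

```python
# A minimal sketch (not from the slides) of the logic of an outcome
# evaluation with random assignment: the program is the independent
# variable (X) and the outcome score is the dependent variable.
import random

random.seed(42)

clients = list(range(100))      # 100 hypothetical clients
random.shuffle(clients)         # random assignment controls selection bias
treatment_group = clients[:50]  # receives the intervention (X)
control_group = clients[50:]    # does not receive the intervention

def simulated_outcome(treated: bool) -> float:
    # Pretend posttest score: baseline noise plus a 5-point treatment effect.
    return random.gauss(50, 10) + (5.0 if treated else 0.0)

treated_scores = [simulated_outcome(True) for _ in treatment_group]
control_scores = [simulated_outcome(False) for _ in control_group]

# With random assignment, the difference in group means is an unbiased
# estimate of the program's effect on the outcome.
effect = (sum(treated_scores) / len(treated_scores)
          - sum(control_scores) / len(control_scores))
print(f"Estimated program effect: {effect:.1f} points")
```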

  12. A closer look at experimental designs. Research design notation: • R = random assignment • O = observation, data collection • X = intervention or treatment

  13. Classic experimental design: controls for the selection bias, history, maturation, and statistical regression threats to internal validity.
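In the R/O/X notation from the previous slide, the classic (pretest/posttest control group) design is conventionally diagrammed as follows; the diagram is standard and is supplied here because the slide image did not survive the transcript:

```
R  O1  X  O2   (experimental group)
R  O1      O2  (control group)
```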

  14. Quasi-experimental design: less control over threats to internal validity, including the possibility of selection bias.
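A common quasi-experimental form, the nonequivalent comparison groups design, keeps the pretest and posttest but drops random assignment (again a standard diagram, added for illustration):

```
O1  X  O2   (treatment group)
O1      O2  (comparison group)
```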

  15. Pretest/posttest design (pre-experimental): least control over threats to internal validity, e.g., history and maturation.
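In the same notation, this one-group pretest/posttest design has no comparison group at all:

```
O1  X  O2   (one group only)
```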

  16. What about measurement? Many different types of measurement tools may be used, depending on the intent and type of the evaluation research.

  17. Using multiple measures and several data collection strategies to evaluate the FACT program (diagram): • measures for the independent variable; • measures for outcomes and causal mechanisms (Does the program cause change? How does change happen?); • measures for inputs and program efficiency; • measures for process/implementation evaluation (What services are being delivered, by whom, and how?).

  18. Logic model: an evaluation produces many pieces of information that must be organized and then interpreted, so we need a way in which this information can be organized.

  19. What is the logic model? A schematic representation of the various components that make up a social service program. Logic models may describe: • theory and its link to change (theory approach model), where attention is on how and why a program works; • outcomes (outcome approach model), where the focus is on connecting resources and activities to expected changes; • activities (activities approach model), describing what the program actually does.

  20. Logic model, outcomes example (diagram): program inputs → program processes → short-term outcomes (measured by the research) → long-term outcomes (difficult to measure), with the aim of identifying the causal mechanism.
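As a hedged illustration of that flow, here is a hypothetical outcome-approach logic model for an imaginary job-training program, expressed as a small Python structure; the program and every component listed are invented for this example, not taken from the original diagram.

```python
# A hypothetical job-training program as an outcome-approach logic model:
# each stage feeds the next, left to right.
logic_model = {
    "inputs": ["grant funding", "two trainers", "classroom space", "referred clients"],
    "processes": ["12 weekly job-skills workshops", "resume coaching sessions"],
    "outputs": ["workshops delivered", "clients completing the program"],
    "short-term outcomes (measured)": ["improved interview-skills scores"],
    "long-term outcomes (hard to measure)": ["stable employment", "economic self-sufficiency"],
}

for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```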

  21. What is an “evaluability” assessment? Newly emergent programs that are not fully operational are not ready for, and indeed can be tarnished by, a summative evaluation geared to assessing program outcomes. How so? An evaluability assessment is a systematic process that helps identify whether program evaluation is justified, feasible, and likely to provide useful information.* It: • determines whether a program is ready for evaluation, either a process or outcome evaluation, or both; • asks whether the program is able to produce the information required for a process evaluation (at what stage of implementation is the program?); • asks whether the program can meet the other criteria for beginning an outcome evaluation; • determines whether the program has the basic foundation for an evaluation to take place.
     * Evaluability Assessment: Examining the Readiness of a Program for Evaluation. Juvenile Justice Evaluation Center, Justice Research and Statistics Association, Program Evaluation Briefing Series #6, May 2003, p. 6. http://www.jrsa.org/pubs/juv-justice/evaluability-assessment.pdf

  22. Evaluability of a program is based on:
     An established program, with • measurable outcomes; • defined service components; • an established recruiting, enrollment, and participation process; • a good understanding of the characteristics of the target population, program participants, and program environment; • the ability to collect and maintain information; • adequate program size.
     Research-savvy service delivery staff, with • problem-solving values and skills; • prior experience with evaluation; • confidence in the program; • commitment to “new knowledge”; • openness to change.
