
Used with Permission of: John R. Slate



Presentation Transcript


  1. A brief overview: What is program evaluation? How is an evaluation conducted? When should it be used? When can it be used? Used with Permission of: John R. Slate

  2. Presentation Outline • Definitions • Purposes • Types • Key concepts of evaluative research • Research designs • Requirements for program evaluation

  3. A Definition of “Program” • “An organized set of resources and activities directed toward a common purpose or goal”

  4. Two Definitions of “Program Evaluation” • “... an assessment, through objective measurement and systematic analysis, of the manner and extent to which Federal programs achieve intended objectives.” (Source - Government Performance and Results Act (GPRA)) • “The application of scientific research methods to assess program concepts, implementation and effectiveness” (Source - General Accounting Office, Designing Evaluations, Report GAO/PEMD-10.1.4)

  5–10. Where Does Evaluation Fit Into a Program’s Planning, Development and Implementation Process? • Make decision to create program and set strategic direction • Determine what the program will do and how it will do it, set targets and program objectives • Create program infrastructure & management, administrative & information systems, develop performance measures • Determine required levels of human resources and material support • Implement action plan (program operation & management, performance measurement, corrective action, etc.) • Conduct program effectiveness, impact, and efficiency evaluations to determine the continued need for the program, alter program design, adjust resource requirements, etc.

  11–12. Types of Program Evaluations • Formative - Judging the worth of a program while activities are forming or in process • focus is on the process more than the outcome • can help make in-process improvements • often involves a small-scale field test • Summative - Judging the effectiveness of a fully operating or completed program • focus is on outcome and overall program worth • can help decisions to expand, terminate, or modify • usually encompasses the entire program

  13. Questions Asked by Program Evaluations* • Descriptive - Statistics on inputs, outputs, and outcomes; how does the program work? • Normative - What is the expected performance (goal) of the program in relation to actual achievement? • Impact - If a goal is not met, why not? Program evaluations must establish a cause/effect relationship between an unmet goal and program activities or other, external factors *from Designing Evaluations, GAO/PEMD-10.1.4
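The descriptive and normative questions above can be sketched in a few lines of code. The outcome scores and target below are hypothetical, invented purely for illustration:

```python
import statistics

# Hypothetical post-program outcome scores for 10 participants
# (all numbers are illustrative, not from a real evaluation).
outcomes = [72, 85, 78, 90, 66, 81, 75, 88, 70, 83]

# Descriptive question: what outcomes did the program produce?
mean_outcome = statistics.mean(outcomes)
median_outcome = statistics.median(outcomes)
spread = statistics.stdev(outcomes)

# Normative question: how does actual achievement compare with the goal?
target = 75  # assumed goal set during program planning
gap = mean_outcome - target

print(f"mean={mean_outcome:.1f}, median={median_outcome:.1f}, sd={spread:.1f}")
print(f"target={target}, gap={gap:+.1f}")
```

The impact question cannot be answered this way: attributing an unmet (or met) goal to the program rather than to external factors requires a comparison group or a similar evaluation design.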

  14. Decisions Program Evaluation Can Help Make • Continue or discontinue a program • Improve policies and procedures • Add or drop specific program elements • Institute similar programs elsewhere • Allocate resources • Accept or reject approaches & theories

  15. Key Concepts of Evaluative Research According to standards in the research and academic communities, program evaluations should strive for scientific method and proof, i.e., they should be: • empirical - based on valid, reliable data • replicable - the study can be repeated in exactly the same way in another time, place, or setting • falsifiable - hypothetical cause/effect relationships can be demonstrated or not
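As a toy illustration of these three properties, the sketch below runs a permutation test on made-up participant and control scores: the data are explicit (empirical), the seed is fixed so the analysis can be re-run identically (replicable), and the hypothesized program effect could fail the test (falsifiable). All numbers are invented for illustration:

```python
import random
import statistics

# Illustrative outcome scores: program participants vs. a control group.
participants = [81, 85, 78, 90, 76, 88, 83, 79]
control      = [72, 75, 70, 80, 68, 74, 77, 71]

observed_diff = statistics.mean(participants) - statistics.mean(control)

# Falsifiability: the claimed cause/effect relationship predicts a
# participant-control difference larger than chance. A permutation test
# asks how often random relabeling produces a difference at least as big.
rng = random.Random(0)  # fixed seed so the analysis is replicable
pooled = participants + control
n = len(participants)
trials = 10_000
extreme = 0
for _ in range(trials):
    rng.shuffle(pooled)
    diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
    if diff >= observed_diff:
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed_diff:.2f}, p = {p_value:.4f}")
```

A p-value near 1 would have failed to support the hypothesized cause/effect relationship — which is exactly the kind of outcome falsifiability requires the design to permit.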

  16. Evaluation Methodology (highly simplified) • Define program • Identify outcome goals and objectives • Model hypothetical cause/effect relationships between program activities and outcomes • Develop goal/objective measurement criteria and desired achievement levels • Locate, collect, and analyze data on program participants (and possibly a control group) • Compare actual results with target levels
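A minimal sketch of the last two steps above — measurement criteria with desired achievement levels, then a comparison of actual results against those targets. The objectives, targets, and participant records are all hypothetical:

```python
import statistics

# Hypothetical measurement criteria: objective name -> desired level.
targets = {"completion_rate": 0.80, "mean_score": 78.0}

# Hypothetical data collected on program participants.
participants = [
    {"completed": True,  "score": 82},
    {"completed": True,  "score": 71},
    {"completed": False, "score": 60},
    {"completed": True,  "score": 90},
    {"completed": True,  "score": 77},
]

actuals = {
    "completion_rate": sum(p["completed"] for p in participants) / len(participants),
    "mean_score": statistics.mean(p["score"] for p in participants),
}

# Compare actual results with target levels for each objective.
met = {name: actuals[name] >= target for name, target in targets.items()}
print(met)  # {'completion_rate': True, 'mean_score': False}
```

In a real evaluation each objective would also need an agreed definition of "success" and validated data sources, per the prerequisites listed later in the presentation.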

  17–20. Evaluation Strategies and Designs* *Based on Designing Evaluations, GAO/PEMD-10.1.4

  21. Evaluability Assessment • A study to determine if, when, and how a program can be evaluated • Prevents premature impact and outcome evaluations • Usually involves: • clarification of the intent of the program • formulation of “testable” cause/effect statements • determination of what can be measured and how • assessment of the validity, reliability, and relevance of the data on which the evaluation would be based

  22. Circumstances When Program Evaluation is Not a Good Idea • When there is no question about the program • When there is no clear program structure or focus • When program activities cannot be distinguished from other activities • When people cannot agree on what the program is trying to achieve • When valid measurement methods and data do not exist and cannot be created • When the study would serve no useful purpose

  23. Overview: Program Evaluation Prerequisites • An operating “program”, i.e., a distinct set of activities and resources with a common purpose and focus • Agreement on program goals and objectives • Agreement on program metrics for its goals and objectives • Agreement on what constitutes program “success” • Existence of, or the ability to develop, valid data • Absence of legal, administrative, or cultural barriers to the study • Agreement on intended use of the study
