Planning and Design of Program Evaluation


Presentation Transcript


1. Planning and Design of Program Evaluation: Plan Your Program Evaluation. Murari Suvedi, Professor, www.msu.edu/~suvedi. October 21, 2010.

2. Identify a Program You Would Like to Evaluate and Ask:
What is the goal of your program?
Who is your audience (the most important one)?
What are the indicators of change, i.e., what will show that your project is making progress toward its goal?
What data do you need to monitor the change?
Who should collect the data, when, from whom, and at what cost?
How will you analyze the data, and to whom will you present the findings?
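
To make these questions concrete, the answers can be captured in a simple worksheet-like record before any data are collected. The sketch below is a hypothetical, filled-in example in Python; the keys and the nutrient-management answers are illustrative assumptions, not part of the original slide.

```python
# A hypothetical, filled-in example of the planning questions above.
# All keys and answers are illustrative; adapt them to your own program.
evaluation_plan = {
    "program_goal": "Reduce nitrogen fertilizer use among participating farmers",
    "primary_audience": "Farmers enrolled in the extension program",
    "indicators_of_change": ["lbs of nitrogen applied per acre", "acres planted to cover crops"],
    "data_needed": ["baseline and follow-up fertilizer records"],
    "data_collection": "Extension staff collect records at enrollment and after one season",
    "analysis_and_reporting": "Compare baseline and follow-up averages; report findings to funders",
}

# Print the worksheet so the team can review it together
for question, answer in evaluation_plan.items():
    print(f"{question}: {answer}")
```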

3. Assess Evaluation Feasibility. Ask the following questions:
Have stakeholders agreed on the intended use of your evaluation?
Is there a commitment to use the evaluation findings?
Is this evaluation a legal requirement?
Is the program of sufficient impact and importance?
Will this evaluation provide valid and reliable information?
Is your program ready to be evaluated?
Are there sufficient resources for this evaluation?

4. Conditions Unfavorable for Evaluation:
When the program has few routines and little stability
When the people involved cannot agree on what the program is trying to achieve
When sponsors of the evaluation or program managers set stringent limits
When there is not enough money, staff time, or qualified staff to conduct the evaluation

5. Consider the Hierarchy of Program Performance:
SEE Change: social, economic, and environmental conditions
Practice Change: patterns of behavior, procedures, or actions
KOSA Change: change in knowledge, opinions, skills, and aspirations
Reactions: positive or negative reactions
Participation: individuals, families, groups, organizations, communities
Activities: meetings, workshops, radio/TV programs, newsletters, etc.
Resources: money, staff time, and material resources needed to conduct program activities
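
Because these levels form an ordered ladder, from resources at the bottom to social, economic, and environmental (SEE) change at the top, they can be written as an ordered enumeration. The sketch below only illustrates that ordering; the class and member names are my own, not from the slide.

```python
from enum import IntEnum

class PerformanceLevel(IntEnum):
    """Hierarchy of program performance, lowest to highest."""
    RESOURCES = 1        # money, staff time, material resources
    ACTIVITIES = 2       # meetings, workshops, media, newsletters
    PARTICIPATION = 3    # individuals, families, groups, organizations, communities
    REACTIONS = 4        # positive or negative reactions
    KOSA_CHANGE = 5      # knowledge, opinions, skills, aspirations
    PRACTICE_CHANGE = 6  # patterns of behavior, procedures, actions
    SEE_CHANGE = 7       # social, economic, environmental conditions

# Evidence at higher levels is generally harder to collect but more convincing
assert PerformanceLevel.SEE_CHANGE > PerformanceLevel.REACTIONS
```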

6. Outcome Hierarchy:
System/Circumstance
Behavior
Skills
Attitude changes
Knowledge
Awareness
Adapted from Claude Bennett, 1976.

7. Consider the Logic Model Evaluation Framework:
INPUTS: staff, volunteers, time, money, materials, equipment, technology, partners
OUTPUTS, Activities: workshops, meetings, camps, curriculum, publications, media, web site, projects, test plots, field days, research
OUTPUTS, Participation: who needs to participate, be involved, or be reached? number, characteristics, reactions
OUTCOMES-IMPACT, Short term (LEARNING): awareness, knowledge, attitudes, skills, aspirations
OUTCOMES-IMPACT, Medium term (ACTION): behavior, practice, decisions, policies, social action
OUTCOMES-IMPACT, Long term (IMPACT): social, economic, environmental, ecological, technological
Context: influential factors
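
Since the framework is essentially a fill-in-the-columns template, one way to work with it is as a small data structure keyed to the columns above. The Python sketch below is a minimal illustration; the field names and the nutrient-management example values are assumptions, not prescribed by the slide.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """One program's logic model: inputs -> outputs -> outcomes, plus context."""
    inputs: List[str] = field(default_factory=list)         # staff, volunteers, money, partners, ...
    activities: List[str] = field(default_factory=list)     # workshops, field days, publications, ...
    participation: List[str] = field(default_factory=list)  # who is reached, numbers, characteristics
    short_term: List[str] = field(default_factory=list)     # learning: awareness, knowledge, skills
    medium_term: List[str] = field(default_factory=list)    # action: behavior, practice, policies
    long_term: List[str] = field(default_factory=list)      # impact: social, economic, environmental
    context: List[str] = field(default_factory=list)        # influential external factors

# Hypothetical example for a nutrient-management program
model = LogicModel(
    inputs=["two extension educators", "program budget"],
    activities=["soil-testing workshops", "on-farm test plots"],
    participation=["120 farmers in three counties"],
    short_term=["knowledge of nutrient management"],
    medium_term=["adoption of cover crops"],
    long_term=["improved well water quality"],
    context=["fertilizer prices", "weather"],
)
print(model.long_term)
```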

8. Logic Model and Common Types of Evaluation:
Needs/asset assessment: What are the characteristics, needs, and priorities of the target population? What are the potential barriers and facilitators? What is most appropriate to do?
Process evaluation: How is the program implemented? Are activities delivered as intended? What is the fidelity of implementation? Are participants being reached as intended? What are participant reactions?
Outcome evaluation: To what extent are desired goals and targets met? Who is benefiting or not benefiting, and how? What seems to work or not work? What are the unintended outcomes?
Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
Source: University of Wisconsin-Extension, Program Development and Evaluation

9. Choose Indicators for Your Program's Evaluation.
An indicator is a marker that can be observed to show that something has changed.
Indicators should be: relevant to the objective, simple and unambiguous, realizable, conceptually well-founded, and limited in number.
Criteria for choosing indicators: measurable; relevant and easy to use; provide a representative picture of the program; show trends; responsive to changes; have a reference; measurable at a reasonable cost.

10. Examples of Outcome Indicators: Nutrient Use and Management
Nitrogen fertilizer use: amount of decrease/increase, __ lbs/acre
Use of cover crops: amount of decrease/increase, __ acres
Well water quality: change in nitrate/pesticide levels, __ ppm

11. Examples of Outcome Indicators: Quality of Life / Social Benefits
Number of work hours per day: ___ hrs/acre or head
Time for community activities: ___ hrs/week
Marketing of farm produce locally: ___ % of total
Personal and family health: ___ sick days/yr
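
Outcome indicators such as these are usually reported as change against a baseline value. The short sketch below shows that arithmetic with made-up numbers for the nitrogen fertilizer indicator; both the helper function and the values are hypothetical.

```python
def indicator_change(baseline: float, follow_up: float) -> tuple[float, float]:
    """Return the absolute and percent change for an outcome indicator."""
    absolute = follow_up - baseline
    percent = 100.0 * absolute / baseline if baseline else float("nan")
    return absolute, percent

# Hypothetical values: nitrogen fertilizer use, lbs/acre
absolute, percent = indicator_change(baseline=150.0, follow_up=120.0)
print(f"Change: {absolute:+.1f} lbs/acre ({percent:+.1f}%)")
# Prints: Change: -30.0 lbs/acre (-20.0%)
```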

12. Consider Utilization-Focused Methods. Selection of methods should be guided by:
the goal of the evaluation,
intended use by intended users,
credibility of the information source, and
believable and understandable data.
Reliable measure: a measure is reliable to the extent that essentially the same results can be reproduced repeatedly, as long as the situation does not change.
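
One common way to check that "essentially the same results can be reproduced" is test-retest reliability: correlate scores from two administrations of the same measure while the situation is stable. The slides do not prescribe a particular statistic, so the Pearson correlation used below, and the example scores, are assumptions.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical scores from the same eight participants, measured twice two weeks apart
first_administration = [12, 15, 9, 20, 14, 11, 18, 16]
second_administration = [13, 14, 10, 19, 15, 11, 17, 16]

# A Pearson correlation close to 1.0 suggests the measure reproduces the same results
r = correlation(first_administration, second_administration)
print(f"Test-retest reliability (Pearson r): {r:.2f}")
```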

13. Let Us Plan Your Evaluation: Complete This Worksheet and Get Ready to Share!

14. If you would like to read further about evaluation models, please visit the following websites:
Program Logic Model: see the full description at http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Targeting Outcomes of Programs (TOP) Model: see the full description at http://citnews.unl.edu/TOP/
