
THE IMPACT AND PRACTICAL UTILITY OF EVALUATION OUTCOME MEASUREMENT AS A FEEDBACK MECHANISM FOR SUSTAINABLE DEVELOPMENT

BY APOLLOS B. GOYOL, PhD, American University of Nigeria, Yola, Adamawa State, Nigeria. Email: apollosgoyol@gmail.com



Presentation Transcript


  1. THE IMPACT AND PRACTICAL UTILITY OF EVALUATION OUTCOME MEASUREMENT AS A FEEDBACK MECHANISM FOR SUSTAINABLE DEVELOPMENT BY APOLLOS B. GOYOL, PhD, American University of Nigeria, Yola, Adamawa State, Nigeria. Email: apollosgoyol@gmail.com AT the International Conference of the African Evaluation Association (L’Association Africaine d’Evaluation) PERSPECTIVES ON IMPACT EVALUATION: Approaches to Assessing Development Effectiveness. An International Conference in Africa for policy-makers, program managers, evaluators, sponsors and other stakeholders in evaluation and development. Sunday 29 March – Thursday 2 April 2009, CAIRO, EGYPT

  2. What is outcome measurement evaluation? • It is a continual and systematic process of assessing the value or potential value of programs to guide decision-making about a program’s or project’s future and sustainability • It is an essential indicator of how programs and projects should be sustained.

  3. Process vs. Outcome Evaluation • Process Evaluation: resources, activities, units produced. Did they do what they said they would do? • Outcome Evaluation: measures of change in individuals, institutions, communities. Did they change what they said they would change?

  4. Realities of Today in Nigeria • More money (billions of Naira) • Increased competition • Greater expectations for effectiveness • Increasing scrutiny • Increasing political awareness • Greater need for teamwork and collaboration

  5. Critical Accountability Questions: • You received billions of naira last year: what did your government and agency do with them? • We have supported your party for years; why should we continue this support? • What are you doing to improve or terminate ineffective programs and curtail wastage?

  6. Why evaluate? • Planning purposes • Analysis of program effectiveness or quality • Direct decision-making • Maintain accountability • Project impact assessment • Project impact and sustenance

  7. When to evaluate? • The timing of program/project evaluation: • Project/program design stage • Project start-up stage • In-progress or formative evaluation • Program wrap-up or summative evaluation • Follow-up studies and feedback • Continuous sustenance

  8. When we evaluate a project/program • We examine the context of the project/program • Study its goals and objectives • Collect information about a project’s/program’s input, outcome and impact • Compare findings to some pre-set standards or mandates • Make a value judgment about the project • Report findings to stakeholders

  9. Types of outcome evaluation • Formative or process evaluation: • Focus on information for program/project improvement, modification, management and sustenance • Summative or impact evaluation: • Focus on determining program/project results and effectiveness (merit and worth) • Serves the purpose of making major decisions about a program: continuation, expansion, reduction, and funding • Uses the logic model as a road map

  10. Evaluator Credibility • Competence: • Knowledge of the program/project • Evaluation expertise • Data analysis and interpretation skills • Report writing and presentation skills • Personal Style: • Communication skills • Strong interpersonal skills • Ability to nurture trust and rapport • Sensitivity in reporting

  11. Logic Model • Identifies both process and outcome portions of your program • Shows relationship of your program inputs to the expected results or outcomes • Helps you identify the major questions you want the evaluation to answer

  12. Logic Model • Provides a graphic summary of how program parts relate to the whole • Makes explicit the underlying theory of a program • Identifies categories to measure in the program evaluation • Sets timelines and benchmarks

  13. Developing a Program Evaluation Logic Model (Sample) • Process: Resources → Activities → Outputs • Outcome: Outcomes → Goals

  14. Resources • Program ingredients • Funds • Staff • Community support • Participants

  15. Activities • Methods used to accomplish program goals • Classes • Counseling • Training

  16. Outputs • Units produced by a program • Number and type of clients served • Number of policies developed • Number of events planned

  17. Outcomes • Short-term and immediate indicators of progress towards a goal • Collaborative partnerships • Improved family functioning • Improved school performance

  18. Goals • Long-term desired program effects • Resilient community • Economic self-sufficiency • Violence prevention

  19. Sample Program Evaluation Logic Model. Program Name: School and Community Violence Prevention Project • Process: Resources → Activities → Outputs • Outcome: Outcomes → Goals

  20. Sample Resources • Staff • Violence prevention curriculum • Case management services • Partnerships • University • Counseling centers • Sheriff’s department • School district • Participants

  21. Sample Activities: Commission Funded • Delivery of violence prevention curriculum in the schools • Intensive violence prevention groups for high-risk youth • Intensive outreach services to families with high-risk youth

  22. Sample Activities: Concurrent Activities • DARE (Drug Abuse Resistance Education) • Family Empowerment Project services to families

  23. Sample Outputs • 4–6 hours of violence prevention education for 1,890 students • 480 students receive intensive prevention training in 10-week groups of 6–10 students each • 185 at-risk families receive intensive outreach services • 60 at-risk families receive Family Empowerment Project services

  24. Sample Outcomes • Improvement in healthy peer social communication • Reduction of violent behaviors in the school climate • Increase in healthy behavior patterns in handling stress • Improved school-related behaviors • Increase in family support • Consumer satisfaction

  25. Sample Goals • Violence prevention

  26. Outcomes You Can Measure • Example: the desired outcome is improvement in parenting skills.

  27. Indicators • Clear parental expectations for child behavior • Use of problem-solving and communication skills • Increased positive parent-teacher interaction • Increased positive family activities

  28. Indicators • Use of behavior reinforcement mechanisms • Home learning environment supportive of school work • Use of family interaction skills • Reduction of conflict

  29. Documenting the Impact of Outcome Evaluation • Impact is a clear description of the value of a program/project to people and society. Generally, these are the longer-term sustainable benefits to clients or society. It could be: • Increased knowledge • Improved attitudes • Financial gain • Production efficiencies • Preservation of environmental resources • Improved condition of health care • Improved condition of roads • Improved condition of education • Improved condition of water supply, etc.

  30. Outcome evaluation checklist for governments, organizations, agencies, etc. • Present a clear, concise plan for evaluating achievement of outcome objectives • State what (outcomes and impacts) will be measured, with benchmarks • State what methods will be used for collecting data • Describe any testing instruments to be used • State who will do the evaluation • Show how outcome evaluation will be used for program evaluation and sustenance.

  31. Summary/Conclusion • Outcome evaluation is a tool to assist in planning, implementing, monitoring, evaluating, accounting for and sustaining programs/projects: • By providing a road map that assists in implementing and sustaining projects/programs • By assisting government ministries, departments and agencies (MDAs) to be accountable and effective • By enabling government to measure the impact and benefit of its projects/programs in consonance with its sustainable agendas

  32. The End. Thank you. Questions?
