Evaluation Quality Assessment And CPE Methodology
Division for Oversight Services, Evaluation Branch
Phnom Penh - November 2011

Presentation Transcript


  1. Evaluation Quality Assessment And CPE Methodology Division for Oversight Services Evaluation Branch Phnom Penh - November 2011

  2. The Challenge How to Reconcile: • The decentralized nature of the Evaluation Function at UNFPA • A corporate system that ensures good quality and credibility of its evaluations

  3. Why an EQA? • To provide an independent assessment (by the Evaluation Branch at DOS) of the quality and usefulness of reports for senior managers (HQ and COs), and also for the information of the Executive Board (reporting) • To strengthen evaluation capacity by providing practical feedback and recommendations to evaluation managers • To contribute to organizational learning and Evidence-Based Management

  4. What is EQA? • a detailed and comprehensive assessment of Evaluation Reports produced in/by UNFPA • a powerful information resource about UNFPA evaluations across a variety of settings (database) • a means of connecting the Evaluation Function within UNFPA’s decentralized system

  5. EQA grid: 9 Core Elements • Structure and Clarity of Reporting • Executive Summary • Design and Methodology • Reliability of Data • Findings and Analysis • Conclusions • Recommendations • Meeting Needs
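
As a purely illustrative aid, the grid can be pictured as one record per report with an entry per core element. The sketch below is an assumption, not the actual DOS template: the class names, fields and the 1-5 rating scale are invented for illustration (Python).

from dataclasses import dataclass, field

# Core elements as listed on this slide.
CORE_ELEMENTS = [
    "Structure and Clarity of Reporting",
    "Executive Summary",
    "Design and Methodology",
    "Reliability of Data",
    "Findings and Analysis",
    "Conclusions",
    "Recommendations",
    "Meeting Needs",
]

@dataclass
class EQAEntry:
    element: str        # one of CORE_ELEMENTS
    rating: int         # hypothetical scale: 1 (poor) to 5 (very good)
    comments: str = ""  # practical feedback for the evaluation manager

@dataclass
class EQAReportAssessment:
    report_title: str
    country: str
    year: int
    entries: list[EQAEntry] = field(default_factory=list)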

  6. Example EQA CP report

  7. EQA Malawi cont.

  8. EQA Malawi cont.

  9. Scoring grid
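
The scoring grid itself appears as an image in the deck. As a rough, hypothetical illustration of how per-element ratings could be rolled up into a single score, the equal weighting and the example ratings below are assumptions, not the DOS scoring rules:

# Hypothetical ratings for one report, keyed by the core elements of slide 5.
ratings = {
    "Structure and Clarity of Reporting": 4,
    "Executive Summary": 2,
    "Design and Methodology": 4,
    "Reliability of Data": 3,
    "Findings and Analysis": 4,
    "Conclusions": 4,
    "Recommendations": 3,
    "Meeting Needs": 3,
}

def overall_score(ratings: dict) -> float:
    # Equal-weight average of the per-element ratings.
    return sum(ratings.values()) / len(ratings)

print(round(overall_score(ratings), 2))  # 3.38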

  10. What Makes a Good Evaluation Report? • A credible report • That addresses the evaluation purpose and objectives • Based on evidence • And therefore can be used with confidence

  11. What are our findings? Example 2: The scope of the evaluation is not sufficiently focused, especially given that: (i) this evaluation follows the MTR, which took place only a year earlier; and (ii) the CPAP P&T tool is of poor quality: baselines (and targets) were highly limited, and none of the indicators have been tracked sufficiently to measure progress throughout CP implementation. Finally, the ToR should not have requested an assessment of the CP's impact (the evaluation is performed during the 4th year of a 5-year CP). The methodological design (and its limitations) is well explained; the evaluators' findings appear to be based on credible data and, in turn, support a rigorous analysis presented in a clear manner. Although the recommendations stem logically from the findings, they are not presented in priority order, nor are the main ones presented (on their own) in the Executive Summary, which is far too long and detailed and, as a result, does not serve its purpose of providing an overview of the evaluation's scope, objectives, methodology and results.

  12. Next Steps (1) • Quality Control: annex the EQA grid (and explanatory note) to all evaluation ToRs • Memo sent to Regional M&E Advisers (PD) and to Reps as well as to ROs (DOS) • Evaluation Reports & EQAs are made available on the Database (internet)

  13. Evaluation Database http://web2.unfpa.org/public/about/oversight/evaluations/

  14. Evaluation Database Reports are searchable by: • Region • Country • Language • Year of Evaluation • Focus Area • Keywords
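
To make the search facets concrete, here is a minimal sketch of filtering a local list of report records on those fields. The record layout, field names and sample data are assumptions for illustration; the actual database is the web application at the URL on the previous slide.

from typing import TypedDict

class Report(TypedDict):
    region: str
    country: str
    language: str
    year: int
    focus_area: str
    keywords: list[str]

# Two made-up records, purely to exercise the filter.
reports: list[Report] = [
    {"region": "Africa", "country": "Malawi", "language": "English",
     "year": 2010, "focus_area": "Reproductive Health", "keywords": ["CPE", "EQA"]},
    {"region": "LACRO", "country": "Bolivia", "language": "Spanish",
     "year": 2011, "focus_area": "Population and Development", "keywords": ["CPE"]},
]

def search(records: list[Report], **criteria) -> list[Report]:
    # Keep records whose fields match every supplied criterion.
    hits = records
    for field_name, wanted in criteria.items():
        if field_name == "keywords":
            hits = [r for r in hits if wanted in r["keywords"]]
        else:
            hits = [r for r in hits if r[field_name] == wanted]
    return hits

print(search(reports, region="LACRO", year=2011))  # -> the Bolivia record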

  15. Next Steps (2): DOS CPE METHODOLOGY • Develop a pilot methodology for Country Programme evaluations that could then be replicated by/in other Country Offices, using the Cameroon and Bolivia pilot cases as illustrations of this methodology • Contribute to raising the quality of CPEs and, consequently, the use of evidence-based evaluation results and lessons learned in the preparation of subsequent programming cycles.

  16. Phases of the Evaluation Process [timeline diagram; a "We are here" marker indicates the current phase of the process]

  17. The evaluation criteria in a country programme evaluation • Component 1 - Analysis of the focus areas. Evaluation criteria: Relevance, Effectiveness, Efficiency, Sustainability • Component 2 - Analysis of the strategic positioning. Evaluation criteria: Strategic Alignment, Responsiveness, Added value
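
The two-component structure can also be written down as a simple mapping; the grouping mirrors the slide as reconstructed above and is only an illustration.

# Evaluation criteria grouped by CPE component (slide 17).
CPE_CRITERIA = {
    "Component 1: Analysis of the focus areas": [
        "Relevance", "Effectiveness", "Efficiency", "Sustainability",
    ],
    "Component 2: Analysis of the strategic positioning": [
        "Strategic Alignment", "Responsiveness", "Added value",
    ],
}

for component, criteria in CPE_CRITERIA.items():
    print(component + ": " + ", ".join(criteria))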

  18. Piloting the Methodology • Different contexts and levels of development: Bolivia (LACRO, lower middle-income country) and Cameroon (Africa, lower-income country) • Different challenges - maternal mortality decreased in Bolivia from 416 (1989) to 229 (2003) but increased in Cameroon from 430 (1998) to 669 (2004) • Different approaches • Both presenting a new CP to the Executive Board - June 2011

  19. Country Programme Evaluation: Objectives • To provide an independent evaluation of the progress, or lack thereof, towards the expected outcomes envisaged in the UNFPA programming documents; where appropriate, the evaluation will also highlight unexpected results (positive or negative) and missed opportunities • To provide an analysis of how UNFPA has positioned itself to add value in response to national needs and changes in the national development context • To present key findings, draw key lessons, and provide a set of clear and forward-looking options leading to strategic and actionable recommendations for the next programming cycle

  20. Outputs • Bolivia : Field work: June 11 - 30 • Cameroon: Field work: July 11 – 29 • Final Reports: October 2011 • CPE Methodology : December 2011 • Training in RO Jo’burg: March 2012

  21. Thank you
