Quality in Evaluation: the international development experience

Presentation Transcript


  1. Quality in Evaluation: the international development experience
     Sophie Davies, Manager, Evaluations Support, Program Effectiveness and Performance Division

  2. Presentation Outline
  • Context
  • Evaluation at AusAID
  • Improving evaluation utility and quality

  3. The Aid Context
  • The current aid program is $4.8 bn (0.35% of GNI)
  • 89% of Australia's aid goes through AusAID
  • Bipartisan commitment to an aid budget of 0.5% of GNI, or $8 bn, by 2015
  • The donor commitment of 0.7% of GNI has never been fully realised
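
The budget and GNI-share figures on this slide imply a rough size for Australia's GNI. A minimal back-of-envelope sketch, assuming the slide's amounts are Australian dollars and that the 2015 target builds in GNI growth:

    # Back-of-envelope check of the slide's figures; "bn" is read as A$ billions.
    current_aid = 4.8        # A$ bn, current aid program
    current_share = 0.0035   # 0.35% of GNI
    print(f"Implied GNI today: ~${current_aid / current_share:,.0f} bn")        # ~$1,371 bn

    target_aid = 8.0         # A$ bn, bipartisan 2015 target
    target_share = 0.005     # 0.5% of GNI
    print(f"GNI implied by the 2015 target: ~${target_aid / target_share:,.0f} bn")  # ~$1,600 bn
    # The gap between the two implied GNI figures reflects assumed growth to 2015.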

  4. International commitment

  5. Where does aid go? 2011-2012 budget, top 5 countries:
  • Indonesia ($558.1m)
  • Papua New Guinea ($482.3m)
  • Solomon Islands ($261.6m)
  • Afghanistan ($165.1m)
  • Vietnam ($137.9m)
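
To get a sense of how concentrated the program is, the five allocations above can be set against the $4.8 bn total from the earlier slide. A small illustrative sketch, assuming the amounts are A$ millions:

    # Top-five country allocations from the 2011-2012 budget slide (A$ m);
    # the $4.8 bn program total comes from the Aid Context slide.
    top5 = {
        "Indonesia": 558.1,
        "Papua New Guinea": 482.3,
        "Solomon Islands": 261.6,
        "Afghanistan": 165.1,
        "Vietnam": 137.9,
    }
    total_budget_m = 4800.0  # A$ m
    top5_total = sum(top5.values())
    print(f"Top 5 combined: ${top5_total:,.1f}m "
          f"({top5_total / total_budget_m:.0%} of the aid program)")
    # Top 5 combined: $1,605.0m (33% of the aid program)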

  6. Reaching the MDGs
  • MDGs: agreed targets to reduce poverty by 2015
  • Adopted by 189 nations at the UN Millennium Summit in September 2000
  • Australia restated its commitment in 2007

  7. Domestic parameters: the aid review
  • Independent review of aid effectiveness, led by Sandy Hollway over the last six months
  • Objective: to examine the effectiveness and efficiency of the Australian aid program and make recommendations to improve its structure and delivery
  • Results are being considered by Government
  • Will be released towards the end of June

  8. Global evaluation parameters: OECD-DAC
  • International standards for evaluation
  • DAC criteria used for quality reports and evaluation:
    • Relevance
    • Effectiveness
    • Efficiency
    • Impact
    • Sustainability
  • Plus AusAID criteria: gender equity, M&E, analysis/learning
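
One way to picture these criteria is as a completeness checklist applied to an evaluation report. A minimal illustrative sketch; the function and the report sections are made up for the example, not AusAID's:

    # Criteria from the slide, treated as a completeness checklist for a report.
    DAC_CRITERIA = ["relevance", "effectiveness", "efficiency", "impact", "sustainability"]
    AUSAID_CRITERIA = ["gender equity", "M&E", "analysis/learning"]

    def missing_criteria(report_sections: set) -> list:
        """Return required criteria the report does not yet address."""
        return [c for c in DAC_CRITERIA + AUSAID_CRITERIA if c not in report_sections]

    # Hypothetical report covering only three criteria:
    print(missing_criteria({"relevance", "effectiveness", "impact"}))
    # ['efficiency', 'sustainability', 'gender equity', 'M&E', 'analysis/learning']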

  9. Performance Management & Evaluation Policy (PMEP)
  • Self-assessment quality reporting, balanced by independent evaluation for 'monitored' activities
  • Quality reporting occurs at:
    • Activity level: at entry and during implementation
    • Program level: annual program review
  • Independent evaluation:
    • at least once every 4 years (Independent Progress Report, IPR)
    • at the end of the program, within its last 6 months (Independent Completion Report, ICR)
  • Policy reviewed every 2 years (most recently in 2010)
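
The timing rules for independent evaluation can be read as two simple checks. A hedged sketch using approximate day counts; the function names and the example dates are illustrative, not part of the policy:

    from datetime import date, timedelta

    def ipr_due(last_ipr: date, today: date) -> bool:
        """True if more than roughly four years have passed since the last IPR."""
        return (today - last_ipr).days > 4 * 365

    def icr_window_open(program_end: date, today: date) -> bool:
        """True once today falls inside the program's final ~6 months."""
        return timedelta(0) <= (program_end - today) <= timedelta(days=183)

    # Hypothetical program: last IPR in March 2007, ending 30 June 2012.
    today = date(2011, 6, 1)
    print(ipr_due(date(2007, 3, 1), today))           # True: an IPR is overdue
    print(icr_window_open(date(2012, 6, 30), today))  # False: ICR window not yet open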

  10. Overarching principles
  • Clear objectives: for all aid interventions
  • Transparency: the default position is that reports are publicly available
  • Contestability and sound evidence: performance reporting is subject to contestability and based on sound evidence
  • Whole of government and other partnerships: seek input from and consult with key partners
  • Aid effectiveness: Paris Declaration principles and the Accra Agenda for Action
  • Efficiency: effort and resources invested are proportional to the value and context of the program

  11. Where Performance and Quality sits at AusAID
  • Programs: self-assessment; manage evaluations
  • P&Q network/managers:
    • over 230 people
    • some with dedicated technical support roles
  • Quality & Performance Systems section:
    • policy and guidance
    • support to programs in applying these
  • Office of Development Effectiveness (ODE):
    • quality checks; Annual Review of Development Effectiveness (ARDE)
    • 2-3 thematic/country-level evaluations per year

  12. Purpose of the PMEP
  • Management:
    • improvements to the future aid program
    • informs program and budget decisions
  • Learning:
    • what works, when, where and how
    • helps to focus funding where it's most effective, efficient and relevant
  • Accountability:
    • to the public, e.g. the Annual Review of Development Effectiveness (ARDE)
    • to partner governments, communities, whole of government and implementing partners

  13. Improving quality under ODE
  • A 2006 meta-evaluation found poor evaluation quality
  • Changes made:
    • revised PMEP/evaluation guidance based on the DAC criteria
    • introduction of a technical review process
    • establishment of an M&E panel of experts
  • 2009: PMEP policy and guidance moved from ODE to Operations & Policy (now Program Effectiveness & Performance)

  14. Four reviews of evaluation quality, 2011
  • Driven by different purposes:
    • review of the technical review process: to improve evaluation processes
    • PMEP review: for policy reform
    • meta-analysis of independent evaluations (ICRs): a content review for the independent review of aid effectiveness
    • meta-evaluation of education ICRs: for understanding across the education sector

  15. Reviews pointed to underlying strengths
  • Good practice exists: internal annual quality reflections are well utilised to monitor and improve program management
  • Evaluation report quality has improved
  • A growing performance culture is built around the Performance & Quality network
  • The M&E panel is well utilised and has helped some programs to improve quality

  16. But evaluation utilisation is poor
  • Reviews identified common issues:
    • focus on outputs over outcomes/impact
    • poor-quality reports; narrow, variable interpretation of criteria
    • weak underlying data from M&E systems
    • low compliance
    • poor use of information; publication is lagging
  • Despite the different audiences of each review, the common message: evaluation is being driven by accountability, not by management and learning

  17. What needs to change?
  • Judicious and strategic use of evaluations
  • Scope and depth that match the evaluation purpose
  • Focus on results and development contribution, not just outputs
  • Management see benefit and utility in evaluation
  • Transparency is improved: broader public understanding; improved accountability to the public, partners and communities

  18. Shifting the balance: how do we do this?
  • For greater management utility:
    • link staff training with support
    • improve current guidance (scope vs purpose)
    • make people accountable for the use of evaluation information
  • For greater learning:
    • more succinct documents which allow for meta-analysis
    • good practice examples identified and shared
  • For better accountability:
    • the independent aid review should provide direction and a framework for agency accountability

  19. Are we on the right track?
