
The ALNAP Meta-evaluation


Presentation Transcript


  1. The ALNAP Meta-evaluation Tony Beck, Peter Wiles and John Lakeman

  2. What is the ALNAP meta-evaluation? • An overview of the quality of evaluation of humanitarian action (EHA) • Identification of strengths and weaknesses • Recommendations for improvement across the sector and in individual agencies

  3. The ALNAP Quality Proforma • ALNAP’s meta-evaluation tool • Draws on good practice in EHA and evaluation in general • Revised and peer reviewed this year

  4. 2003-4 meta-evaluation • Rated a representative set of 30 evaluations • Focus this year, at the request of ALNAP members, on: • Good practice • Dialogue with 11 evaluation offices, focusing on the impact of evaluation processes on evaluation quality

  5. Agencies included in dialogue CAFOD, Danida, ECHO, ICRC, OCHA, OFDA, Oxfam, SC-UK, SIDA, UNHCR, and WHO

  6. Findings from dialogue with evaluation managers • Some areas affecting evaluation quality not currently captured by the QP • Evaluation quality depends on subtle negotiations within agencies about key findings, eg staffing, use of DAC criteria • Likely follow-up from recommendations is difficult to predict and dependent on a number of processes in agencies

  7. Findings from dialogue with evaluation managers: the EHA market • The main constraint on improved evaluation quality is agencies' access to available evaluators with appropriate skills • Does the EHA market need further regulation?

  8. Mainstreaming of the Quality Proforma • By ECHO to revise terms of reference (lesson learning, protection, identification of users, prioritisation, time frame and users of recommendations, etc) • DEC Southern Africa evaluation (rated 7 agency reports) • Groupe URD (for planning of evaluations)

  9. Findings from the Quality Proforma 2003-2004 • Significant improvement in use of DAC criteria, although efficiency and coherence remain problematic • Greater attention to protection (2002/3: 6 per cent rated satisfactory or better; 2003/4: 32 per cent rated satisfactory)

  10. Findings from the Quality Proforma 2003-2004 • No improvement in the appropriateness of evaluation methods used, vis-à-vis good practice • Limited improvement in primary stakeholder consultation (13 per cent satisfactory or better in 2002/3; 20 per cent in 2003/4) • Most other QP areas fairly similar to the 2000-2002 average • Greater attention to HIV/AIDS

  11. Good practice examples • Follow-up • Consultation with primary stakeholders • Socio-economic analysis • Real-time evaluation (RTE) • Evaluation of efficiency • Protection

  12. Next steps • Agencies valued interaction and dialogue with meta-evaluators; this should continue • Eg internal/external rating by agencies and meta-evaluators using a slimmed-down Quality Proforma (mainstreaming) • Eg interaction between non-agency evaluators and meta-evaluators

  13. Next steps • Is work needed on the EHA market, eg bringing in evaluators from the South? • Best format for proceeding: a working group on evaluation quality?
