
Strengthening organisational capacities for evaluation of humanitarian action


Presentation Transcript


  1. Strengthening organisational capacities for evaluation of humanitarian action London 28th September 2010

  2. Introduction • Evaluation revolution over last decade • ALNAP • Evaluation Guides • Evaluation Research • Evaluation Database • Meta evaluations

  3. But... • Full benefit not being realised • Not part of the culture of organisations • Disconnected • Growing scepticism: “Nothing changes”, e.g. Haiti

  4. Efforts underway to address this • UNEG • OECD-DAC • Individual agencies – CARE, DFID • Not a new issue in the wider evaluation world, e.g. Patton: Utilisation-Focused Evaluation

  5. Action research • Followed other work: Sandison, Peta (2007) “The Utilisation of Evaluations” (London: ALNAP) • Literature review • Interviews • Draft paper and workshop • Further interviews, workshops...

  6. Limitations • Work in progress • Mostly only spoken to evaluators • Aim of today: • Endorse/reject/alter the framework • Swap experience • Plan the way forward

  7. Proposed Framework Capacity Area 1: Leadership, culture and structure Capacity Area 2: Evaluation purpose and policy Capacity Area 3: Evaluation Processes and Systems Capacity Area 4: Supporting processes and mechanisms

  8. Capacity Area 1 Leadership, culture and structure

  9. Ensure leadership is supportive of evaluation and monitoring • Leadership is key – interviews and literature • Recruit? • Motivate? • Evaluation champions? • Demonstrate the benefit of evaluation

  10. Leadership • Leadership values evidence from evaluations • Agree 59% Strongly agree 32% (91% combined) • Disagree 5% • Strongly disagree 5% • Total responses: 22
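(Aside: the “combined” figure here is simply the sum of the agree and strongly-agree shares. A minimal sketch of such a tally, using hypothetical raw counts chosen to reproduce this slide’s rounded percentages from its 22 responses:)

    from collections import Counter

    # Hypothetical raw responses (22 respondents), chosen so the rounded
    # shares match the slide: 59% agree, 32% strongly agree, 5% + 5% disagree.
    responses = (["agree"] * 13 + ["strongly agree"] * 7
                 + ["disagree"] * 1 + ["strongly disagree"] * 1)

    counts = Counter(responses)
    total = len(responses)
    for option in ("agree", "strongly agree", "disagree", "strongly disagree"):
        print(f"{option}: {counts[option] / total:.0%}")

    # Combined agreement = agree + strongly agree.
    combined = (counts["agree"] + counts["strongly agree"]) / total
    print(f"combined: {combined:.0%}")  # 20 of 22 -> 91%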

  11. Operational settings • Still positive, but less so for operational settings (Red Cross and NGO more negative) • Agree 64% • Disagree 27% • Strongly agree 9%

  12. Evaluation culture • Needs to be tackled early on • Virtuous circle? • Each organisation unique • Strategy needed • Learn from experience of others • Cultural web

  13. Evidence and data is actively sought to help decision-making at all levels of the organisation • Agree 60% • Disagree 40%

  14. The ‘personal factor’ • Go from abstract to real and specific • Identify actual primary intended users • Exactly what do they want, and when? • Build ownership – not a distant independent judge! • Evaluators need people skills!

  15. All evaluations include a stakeholder analysis of intended users • Disagree 60% • Agree 40% • NGO and Red Cross less positive

  16. Organisational structure Central unit? Decentralised evaluations? Reporting to the board?

  17. Structural position of the evaluation unit has tangible impact on emphasis placed on evaluations (e.g. for accountability versus for learning) • Agree 41% • Strongly agree 27% • Disagree 32%

  18. Capacity Area 2 Evaluation purpose and policy

  19. Why evaluate? Lesson-learning or accountability? • The key question and the most controversial! • Mandated evaluations undercut utility... • Separate out these functions?

  20. My organisation recognises and actively works to resolve the tensions between accountability and learning • Disagree 50% • Agree 36% • Strongly agree 14% • No clear-cut differences across organisational types

  21. There is separation of accountability and learning functions by the department/individuals involved: • Agree 55% • Strongly agree 14% • Disagree 32% (UN)

  22. Policy • Does it exist? • Is it tailored for humanitarian action? • Do evaluators see it? • Does it include reference to utilisation? • OECD-DAC criteria? • Is more flexibility needed? • Policy does not mean practice...

  23. Within my organisation, there is a formal policy relating to the evaluation of humanitarian aid: • Agree 41% • Strongly agree 36% • Disagree 14% • Strongly disagree 9%

  24. This policy is distinct from the evaluation policy of development aid: • Agree 36% • Strongly agree 18% • Disagree 23% • Strongly disagree 23%

  25. The policy makes reference to utilisation and follow-up of evaluations • Agree 41% • Strongly agree 32% • Disagree 18% • Strongly disagree 9%

  26. Timeliness • Depends on why the evaluation is being done • For learning: findings needed in time to change programmes • Real-time evaluations (RTEs) are a response to this • Evaluations need integrating into the programme cycle

  27. Evaluations are specifically timed to meet programme management requirements: • Agree 55% • Strongly agree 18% • Disagree 28%

  28. Quality not quantity • Finite capacity for using evaluations • Reflection takes time • Unused evaluations lower morale and credibility

  29. There is capacity within my organisation to reflect on, absorb and act upon the findings of evaluations • Agree 60% • Disagree 32% • Strongly disagree 9%

  30. Capacity Area 3 Evaluation Processes and Systems

  31. Develop a strategic approach • Where are the problem areas? • Not necessarily appropriate to cover everything • SIDA example • Where is change most likely? • Where is change most needed?

  32. My organisation has developed and implemented a strategic approach to selecting what should be evaluated • Disagree 55% • Agree 36% • Strongly agree 9%

  33. Involve key stakeholders • Evaluations are political • Need buy-in early in the process • Think downward accountability as well as upward • Don’t have too many stakeholders! • Reference groups can help

  34. There is a mechanism for involving key stakeholders in the evaluation process: - at the outset of the evaluation: • Agree 50% • Strongly agree 18% • Disagree 32%

  35. There is a mechanism for involving key stakeholders in the evaluation process: - In drawing up the ToR: • Agree 41% • Strongly agree 18% • Disagree 41%

  36. There is a mechanism for involving key stakeholders in the evaluation process: - In commenting upon final lessons and recommendations: • Agree 59% • Strongly agree 18% • Disagree 23%

  37. Develop a range of evaluation tools • An unnecessary framework category? • Timeliness already covered • Brevity linked to dissemination • Focus on key issues – linked to working out why you are doing the evaluation in the first place

  38. One of the most supported issues! • More evaluation tools are needed within my organisation: • Agree 64% • Strongly agree 18% • Disagree 19%

  39. Mix internal and external staff • With all-external teams, the outsiders learn the most (process use) • They are expensive and take that learning with them • They don’t understand the nuances

  40. Insiders participate in evaluation teams: • Agree 59% • Strongly agree 18% • Disagree 23%

  41. Mixed teams/insider teams offer advantages compared with teams comprised solely of external evaluators: • Agree 73% • Strongly agree 27%

  42. Technical quality – a topic for a good practice review

  43. Dissemination • This is a key strategic issue • Think about it when commissioning the evaluation • Can include targeted one-to-one briefings and a range of products: TV documentaries, themed reports

  44. My organisation has a dissemination strategy for evaluation findings: • Agree 45% • Strongly agree 18% • Disagree 37%

  45. Ensure management response • UNDP Evaluation Resource Centre • FAO: two-year reviews of implementation of findings
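(Neither the UNDP nor the FAO system is specified in detail here, but the core of any management-response mechanism is a record linking each recommendation to a response and a follow-up status that can be revisited over time. A minimal, entirely hypothetical sketch – all names and dates illustrative:)

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class Recommendation:
        """One evaluation recommendation plus its management response."""
        evaluation: str                    # which evaluation it came from
        text: str                          # the recommendation itself
        response: str = ""                 # management's accept/reject rationale
        status: str = "open"               # "open", "in progress", "implemented"
        review_due: Optional[date] = None  # e.g. an FAO-style two-year review

    tracker = [
        Recommendation(
            evaluation="Example emergency response evaluation",  # illustrative
            text="Time evaluations to fit the programme cycle",
            response="Accepted; to be piloted in the next planning round",
            status="in progress",
            review_due=date(2012, 9, 28),
        ),
    ]

    # Follow-up over time: flag recommendations not yet implemented.
    for rec in tracker:
        if rec.status != "implemented":
            print(f"{rec.evaluation}: {rec.text} [{rec.status}]")

A real system would persist these records and report on them at each review date; the point is only that follow-up requires the response and its status to be recorded somewhere queryable.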

  46. There is a formal system within my organisation in which managers respond to evaluation findings and recommendations: • Agree 45% • Strongly agree 18% • Disagree 36% (NGOs and Red Cross)

  47. There is follow-up of this response over time to see whether progress in implementing recommendations has been made: • Disagree 50% • Strongly disagree 9% • Agree 32% • Strongly agree 9%

  48. Meta evaluations • Very high demand for themed work • Identify key benchmarks and run these through a series of evaluations (e.g. HR in emergencies) • Corroborate findings across different evaluations – see the sketch below
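(A hypothetical sketch of the benchmark idea: record how each evaluation in a series rates a common benchmark, then check whether the ratings corroborate one another. Every name and rating below is illustrative, not drawn from any real study:)

    from collections import Counter

    # Illustrative data: how several evaluations rated one shared benchmark.
    benchmark = "HR in emergencies"
    ratings = {
        "Evaluation A": "weak",
        "Evaluation B": "weak",
        "Evaluation C": "mixed",
    }

    tally = Counter(ratings.values())
    rating, count = tally.most_common(1)[0]
    print(f"{benchmark}: rated '{rating}' by {count} of {len(ratings)} evaluations")
    if count >= 2:
        print("Finding corroborated across more than one evaluation.")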

  49. Meta evaluations are carried out within my organisation: • Strongly disagree 9% • Disagree 41% • Agree 36% • Strongly agree 14%

  50. Capacity Area 4 Supporting processes and mechanisms
