
Multi-Donor Evaluation of Support to CPPB Activities in Southern Sudan, 2005 – 2010





  1. Multi-Donor Evaluation of Support to CPPB Activities in Southern Sudan, 2005 – 2010
  Evaluative lessons: Improving the conduct and use of evaluations of conflict prevention and peacebuilding activities

  2. Methodological Overview
  Stage 1:
  • Conflict Analysis
  • Policy, Portfolio, and Evaluation Reviews
  • Terms of Reference for Stage 2
  Stage 2:
  • Fieldwork used to ‘ground-truth’ the working hypotheses and evidence gathered in Stage 1.
  • With a very large number of donor interventions to evaluate ($4 billion; over 2,700 projects), representative sampling was not possible.
  • The selection of field visits was therefore clustered around ‘working hypotheses’ (e.g. the importance of road construction to peacebuilding), for which selected project sites were identified (e.g. contrasting USAID vs. WFP approaches).

  3. Lesson 1: Embedding the Conflict Analysis in the Evaluation
  • The OECD guidelines start with an ambitious vision of integrating conflict analysis (and the Utstein palette) but quickly revert to conventional evaluation.
  • While there is now greater consensus on how to conduct a conflict analysis, it is still carried out primarily for planning (rather than evaluative) purposes – with a different level of rigour required.
  • It is a challenge to fully integrate conflict analysis into evaluations, e.g. when moving from the Conflict Analysis (more abstract) to the Evaluation Matrix questions (very definitive).
  • Lesson: There needs to be a more deliberate step for teams to further integrate and ‘own’ the conflict analysis, and for clients to accept a narrowing/reduction of the TOR focus and questions.

  4. Lesson 2: Using the Utstein CPPB categories
  • The Utstein palette provides a useful way of describing conflict prevention and peacebuilding, and of showing that the breadth of CPPB activities goes beyond socio-economic and mediation interventions.
  • It does not, however, provide an analytical framework, as there is no underlying conceptual basis (e.g. for the relationships between the pillars, sequencing, or prioritisation).
  • The CPPB categories can even weaken the analysis if applied too rigidly, particularly as donors themselves do not configure their policies and portfolios around CPPB categories.
  • Lesson: Emphasis should be placed on the conflict analysis, rather than the CPPB categories, as the analytical tool. Too many different frameworks can weaken the analysis.

  5. Lesson 3: Selecting themes and projects for fieldwork
  • There is a tension between focusing on donor priorities (e.g. pooled funds, which accounted for only 20% of funding) and evaluation priorities around CPPB (e.g. security sector reform).
  • In Southern Sudan, the selection of fieldwork sites was based on hypotheses drawn from Stage 1 (the conflict analysis and the policy, portfolio and evaluation reviews). These were clustered around core themes rather than individual projects.
  • Lesson: This approach provided a useful basis for yielding policy-level conclusions, but the evidence base can be seen as more selective than under a purely project-based approach.

  6. Lesson 4: Verifying Evidence in Conflict-affected Countries
  • Solid, reliable data is particularly difficult to collect (and collate) in conflict-affected countries.
  • High staff turnover in many agencies, combined with short evaluation visits, can provide a misleading ‘snapshot’ – especially if the history of aid is not fully understood.
  • Indeed, the ‘hub of knowledge’ may rest with a handful of individuals in any one country.
  • Lesson: It is important to get the right balance between country-expert views and methodological expertise. Methodological innovation is needed where sampling and triangulation of stakeholder views will be insufficient.

  7. Lesson 5: Theories of Change at the policy/strategic level
  • Applying a Theory of Change approach at the policy or strategic level is problematic, as most donors do not articulate a clear, evidence-based rationale for intervening.
  • In particular, policies are the product of ‘political’ and institutional processes involving different interested parties, and may be deliberately left open to allow broad interpretation.
  • Lesson: The analysis of donor policies is a challenging area: both to capture differences between stated and de facto policies, and to avoid evaluators assuming too much. In particular, reconstructing the Theory of Change at this level is highly interpretive and open to challenge.
