
June 2011 Update on The Multilateral Effectiveness Initiative



  1. June 2011 Update on The Multilateral Effectiveness Initiative Presented by CIDA on behalf of the Task Team on Multilateral Effectiveness

  2. History and Context
  • 2009
  • A large Task Team on Multilateral Effectiveness was established to explore and further develop a proposal to strengthen information on the development effectiveness (DE) of Multilateral Organizations (MOs)
  • A smaller Management Group (MG) (US, UK, SADEV, CIDA, WB, UNEG and MOPAN) was created to carry out the work more efficiently
  • 2010
  • The methodology and approach were pilot tested (ADB and WHO) under the guidance of the MG/Task Team
  • A draft report was submitted to the Network in November
  • The Network provided guidance to the Task Team on completing the pilot test phase

  3. Guidance From the Network (Nov. 2010)
  • The Task Team was requested to:
  • Finalize the pilot test (ADB and WHO)
  • Build stronger links with MOPAN and examine the complementarity of results with MOPAN 2010 assessments
  • Refine the methodology and guidelines
  • Engage further with MOs and with all stakeholders
  • Hold a Management/Steering Group meeting in spring 2011 to take stock and prepare a response to the Network on lessons and future steps, taking into account developments by MOPAN

  4. Steps Taken Since November 2010
  • The Technical Team discussed strategies for complementarity and convergence with the MOPAN Technical Working Group
  • Received MOPAN reports on ADB and WHO and prepared a draft paper on complementarity
  • Held a Technical Team workshop (CIDA, SADEV, DFID, with a MOPAN representative) to discuss possible revisions to the pilot test report, the paper on complementarity with MOPAN, and the methodology and guidelines
  • ECG and UNEG were invited to participate in the MG; however, they reconsidered their participation and sent their regrets
  • The MG (US, UK, SADEV, CIDA, MOPAN) met and agreed on revisions to the pilot test report, the paper on complementarity with MOPAN, and the methodology and guidelines

  5. Pilot Test Report
  • Focused more directly on the implications of the results for the validity and utility of the meta-evaluation methodology
  • Clarified that the results reported are illustrative and were used only to test the approach and methodology
  • Classified the validity of findings for each criterion, as illustrated by each pilot case

  6. Pilot Test Report Results: An Illustration Using Selected Criteria for ADB
  [Chart: red circles highlight the clustering of findings; validity of results is based on the number of evaluations]

  7. Pilot Test Report Results: An Illustration Using Selected Criteria for WHO
  [Chart: red circles highlight the clustering of findings; validity of results is based on the number of evaluations]

  8. Conclusions of the Pilot Test
  • The proposed approach is workable and can be implemented within the estimated time (6-8 months) and resource requirements (USD 125,000) with little burden on the MO
  • Where the MO produces an adequate volume of evaluation reports over a 3-4 year period covering a significant portion of investments, the approach works well and the results can be generalized about the MO's DE
  • Where an adequate number of evaluation reports is not available and the coverage of activities cannot be estimated, the results on DE are interesting but harder to generalize
  • At the completion of the pilot test there were opportunities to improve the methodology by refining the process and instruments

  9. Engagement and Complementarity with MOPAN
  • Continued participation by MOPAN in the MG and dialogue between the Technical Team and the MOPAN Technical Working Group
  • Identified MOPAN Key Performance Indicators (KPIs) and Micro-Indicators (MIs) that can be compared to the approach's criteria
  • Compared the results of the pilot test to the survey results reported by MOPAN

  10. Complementarity with MOPAN: Illustration of Comparable Pilot Test and MOPAN 2010 Survey Results (ADB)
  [Chart: high level of concurrence, with two exceptions]

  11. Complementarity with MOPAN Analysis
  • Where pilot test and MOPAN Survey criteria are comparable (seven of the 19 tested criteria), the results are often in agreement (e.g. on the strength of the evaluation function)
  • Where the results are not in agreement, this can be explained by the different time frames and organizational levels examined by each approach (e.g. on the strength of RBM systems)
  • The two approaches focus on different aspects of MO effectiveness and rely on different information sources
  • The results are complementary rather than in conflict; together, they can provide a more complete picture of an MO's overall performance

  12. Improving the Approach and Methodology: Key Revisions
  • Reorganized and focused the criteria more directly on development effectiveness
  • Clarified the process for selecting options for strengthening development effectiveness information
  • Refined the quality assurance criteria

  13. Revised Methodology: Reorganized Criteria for Assessing Development Effectiveness
  • The achievement of development objectives and expected results (including policy impacts)
  • Cross-cutting issues: inclusive development that is gender sensitive and environmentally sustainable
  • The sustainability of benefits and results achieved
  • The relevance of MO activities and supported projects and programs
  • The efficiency of MO operations in support of projects and programs
  • The use of monitoring and evaluation to improve development effectiveness

  14. Revised Methodology: Identifying Scenarios and Options
  • Preliminary Review: review of essential documentation, leading to one of three scenarios
  • Scenario A: MO reporting on DE is adequate → Option 1: rely on MO reporting systems
  • Scenario B: MO reporting on DE is not adequate but the evaluation function is → Option 2: conduct a systematic synthesis of information from available evaluations (apply the meta-synthesis of evaluation results methodology)
  • Scenario C: MO effectiveness reporting and available evaluations are inadequate for reporting on DE → Option 3: implement actions aimed at strengthening the MO's evaluation system and DE reporting

  15. Revised Methodology: Evaluation Quality Assurance Criteria
  [Table: new quality assurance criteria]

  16. Options for Moving Forward
  • Recognize that the pilot test has been successful and acknowledge that the methodology is available for bilaterals, multilaterals, or others to apply as desired
  • In light of the point above, recognize that others can build on the methodology and guidelines, with greater participation by members of MOPAN and individual MOs, as further work to assess the effectiveness of MOs is undertaken
  • Acknowledge that some donors and groups of donors may move forward to apply the methodology (jointly or individually; sequentially or concurrently with MOPAN)
  • Explore over time how to institutionalize or formalize the capacity and responsibility for assessing the DE of MOs with MOPAN or other organizations
  • Explore further engagement with the MOs on approaches and vehicles for assessing their DE

  17. Questions to Be Addressed
  • Are the approach and methodology acceptable to the Network?
  • Can agencies using the approach and methodology indicate that it has been endorsed by the Network?
  • What is the level of interest among member agencies of the Network in leading or participating in reviews of MOs using the (revised) approach and methodology?
