
Management Response to the Biennial Report on Evaluation 2012 DP/FPA/2012/8



  1. Management Response to the Biennial Report on Evaluation 2012 (DP/FPA/2012/8), June 2012

  2. Outline • Overall management opinion • Areas of agreement • Areas of concern • Key DOS finding: Lack of progress in evaluation • Management concern #1: Narrow scope of the report • Management concern #2: Unreliable methodology • Key DOS findings and conclusions on management of evaluations • Management concern #1: Inconsistent recommendations • Management concern #2: Interpretation of independence

  3. Overall management opinion • Welcomes the Biennial Report on Evaluation 2012 and thanks the Division of Oversight Services (DOS) for its feedback and critical look at evaluation • Agrees that coverage of decentralized evaluation has improved and that the quality of evaluations requires further efforts • Questions the validity of the conclusions and recommendations, as the report is neither comprehensive nor current enough to adequately present the status of evaluation in UNFPA

  4. Outline • Overall management opinion • Areas of agreement • Areas of concern • Key DOS finding: Lack of progress in evaluation • Management concern #1: Narrow scope of the report • Management concern #2: Unreliable methodology • Key DOS findings and conclusions on management of evaluations • Management concern #1: Inconsistent recommendations • Management concern #2: Interpretation of independence

  5. Areas of agreement • The DOS report highlights a number of key areas that management agrees with: • Coverage of decentralized evaluation has improved • Quality of evaluations requires further efforts • Staff capacity in evaluation requires more attention • Significant efforts are underway to improve evaluation

  6. Outline • Overall management opinion • Areas of agreement • Areas of concern • Key DOS finding: Lack of progress in evaluation • Management concern #1: Narrow scope of the report • Management concern #2: Unreliable methodology • Key DOS findings and conclusions on management of evaluations • Management concern #1: Inconsistent recommendations • Management concern #2: Interpretation of independence

  7. Summary of concerns about scope • DOS biennial report: • covers only DOS Evaluation Branch “achievements” and overlooks significant management efforts and progress in the three key areas of evaluation coverage, use, and quality; and • does not include any substantive evidence generated by evaluations during the biennium

  8. Coverage and use: Significant progress (charts: evaluation coverage; evaluation use)

  9. Use and quality improvement strategy being implemented • Quality of evaluation is not yet as high as desired, but a comprehensive strategy has been developed to improve it: • Developing guidelines and tools • Conducting trainings/webinars • Establishing quality enhancement mechanisms • Institutionalizing knowledge sharing and continuous learning from experience • Improving planning and coordination

  10. Use and quality improvement strategy #1: Developing guidelines and tools • Evaluation Guidelines (2011) • Programme Review Committee established (2011) • UNDG RBM Handbook: UNFPA was co-chair • Consultant briefing package on evaluation (2011) • RBM Guide and e-learning modules for staff (2010/2011) • Framework for reporting on programme results (2011) • Guidance note on evidence-based programming (2011) • Guide for developing robust results frameworks for effective programming (updated 2011) • Communities of Practice (fully operational in 2011)

  11. Use and quality improvement strategy #2: Major efforts in evaluation training (charts: country offices covered by evaluation training; staff participation in evaluation training) • Most staff (92%) who managed 2011 CPEs had received training • Source: UNFPA Programme Division/ESPB evaluation training data

  12. Use and quality improvement strategy #2 (cont’d): Fund-wide efforts in evaluation training

  13. Use and quality improvement strategy #3: Developing evaluation quality enhancement mechanisms • Peer Support Network established • Support provided on an as-needed basis • Review and sign-off of evaluation Terms of Reference by Regional M&E Advisers • 100% of Terms of Reference reviewed and signed off in 2011 • Formation of Evaluation Management Committees to guide evaluations • 95% of evaluations guided by Evaluation Management Committees in 2011 • Draft evaluation reports reviewed by Regional M&E Advisers • 80% of draft reports reviewed by Regional M&E Advisers in 2011

  14. Use and quality improvement strategy #4: Institutionalizing evaluation knowledge sharing and learning from experience • Management response tracking system established and accessible online • Evaluation policy briefs: synthesis of evaluation findings • Information materials: ‘Evidence & Action’ series on evaluation topics • Issue 2 - Spotlight on evaluation: UNFPA country programme evaluations • Issue 3 - Country programme evaluations: effective management • Issue 4 - UNFPA country programme evaluations: a progress update • Generation of lessons learnt from evaluations (example: effective management of evaluation) • Harnessing of evaluation consultants’ experiences for learning purposes

  15. Use and quality improvement strategy #4 (cont’d): Management Response Tracking System

  16. Use and quality improvement strategy #5: Coordination of the evaluation function • Coordination of monthly teleconferences between HQ and ROs • Annual global meetings: joint planning, review of progress, strategy and action plan development • Status updates on evaluation implementation: preparation of progress reports on action plan implementation • Coordination of global evaluation trainings • Fund-wide M&E network established (50+ M&E practitioners) • Knowledge sharing about the evaluation function: • Evaluation website and M&E net maintenance • Support to DOS in populating the evaluation database • Provision of evaluation-related input into organizational reporting

  17. Quality improvement results to date: Enormous progress in improving results frameworks (chart: percentage of CPD output indicators with both baselines and targets) • The Bolivia and Cameroon 2012 CPDs went through the Programme Review Committee and were both rated ‘Good’ • Source: CPDs submitted to the EB annual session 2012, including Cameroon

  18. Quality improvement results to date: Progress starting to be seen in CPDs (table of key examples)

  19. Summary of concerns about scope • DOS biennial report: • covers only DOS Evaluation Branch “achievements” and overlooks significant management efforts and progress in the three key areas of coverage, use, and quality; and • does not adhere, in its purpose, to the Executive Board decision

  20. Purpose of the report does not adhere to the Executive Board decision • Executive Board decision 2010/26 requests that “future biennial reports on evaluation address, inter alia, findings and recommendations of evaluations, analysis of the factors affecting quality, and the follow-up to evaluations conducted by UNFPA” • The report does not analyse and present the findings and recommendations of UNFPA’s biennial evaluative work to inform programming and decision-making

  21. Outline • Overall management opinion • Areas of agreement • Areas of concern • Key DOS finding: Lack of progress in evaluation • Management concern #1: Narrow scope of the report • Management concern #2: Unreliable methodology • Key DOS findings and conclusions on management of evaluations • Management concern #1: Inconsistent recommendations • Management concern #2: Interpretation of independence

  22. Unreliable methodology: Report based on outdated information • The report states that it covers 2010 and 2011 evaluations, but in fact it only looks at CPEs conducted in 2010 and does not include an assessment of any of the 30 CPEs conducted in 2011 • The survey on evaluation practice also does not include 2011 CPEs • The findings and conclusions on RBM in the report are based on results frameworks of CPDs developed in 2006/07

  23. Unreliable methodology: Incomparability of assessment results across EQA reports over the years • The instability of the assessment criteria and ratings used in Evaluation Quality Assessment (EQA) reports does not allow meaningful comparison over time, so credible conclusions about trends cannot be drawn

  24. Unreliable methodology: Change in methodology and lack of definitions in the 2012 EQA • The methodology changed several times in the current biennium; most importantly, a different weighting system was introduced in May 2011 • The absence of definitions for the different scale values is a major gap and inconsistency in the 2012 EQA report • For details on the methodology, refer to the Management Response to the DOS Advisory Report (27 March 2012)

  25. Outline • Overall management opinion • Areas of agreement • Areas of concern • Key DOS finding: Lack of progress in evaluation • Management concern #1: Narrow scope of the report • Management concern #2: Unreliable methodology • Key DOS findings and conclusions on management of evaluations • Management concern #1: Inconsistent recommendations • Management concern #2: Interpretation of independence

  26. DOS Report of 2010 recommends dividing labor between management and the independent evaluation function

  27. DOS Report of 2010 recommends dividing labor between management and the independent evaluation function (cont.)

  28. DOS Report of 2010 recommends dividing labor between management and the independent evaluation function (cont.)

  29. DOS report of 2012 contradicts its 2010 recommendations • The 2012 report proposes re-merging the management and independent evaluation functions: • “The Evaluation Branch should fully exercise its evaluation management function by restoring the necessary link between the accountability and learning dimensions of evaluation, which are artificially split…” • Management has concerns about this: • The conclusion is premature: • Efforts based on the division of labor have just begun and are only now showing results, so it is too early to change course while evidence of the effects is unknown • An independent review of the evaluation policy is underway • Evidence shows that structure is not a panacea: in 2005, when the functions were combined, the EQA found 88% of evaluations unsatisfactory

  30. Outline • Overall management opinion • Areas of agreement • Areas of concern • Key DOS finding: Lack of progress in evaluation • Management concern #1: Narrow scope of the report • Management concern #2: Unreliable methodology • Key DOS findings and conclusions on management of evaluations • Management concern #1: Inconsistent recommendations • Management concern #2: Interpretation of independence

  31. Independence of evaluation at UNFPA • Background on the concept of independence of evaluation: • In its 2010 report, DOS advised management that the use of a third party is one of the mechanisms for guaranteeing the independence of a CPE • Evaluation policy: “Decentralized evaluations will ensure objectivity and impartiality through a variety of mechanisms built into the evaluation plans, such as the provision for external review experts, advisory committees and the use of independent evaluators.” (page 4, principle d) • Status of implementation: • Management is following the Evaluation Policy and the advice provided by DOS: all CPEs are conducted by independent third parties, and management is applying all of these mechanisms • The report alleges that “Evaluations managed outside of DOS do not meet the independence requirement” but does not provide any evidence to support this claim • Management does not consider it appropriate to change its approach without a change in the policy

  32. Conclusion • There is growing evidence that evaluation in UNFPA is improving • The report could have been more helpful to management had its scope been wider and its methodology robust enough to reflect the status of evaluation in UNFPA across the entire period under review

  33. ANNEX

  34. … is the only 2011 evaluation covered in the survey referred to in para. 8 of the DOS report. It was originally planned for 2010 but was delayed because of the earthquake

  35. Thank you!
