
UNFPA – Occupied Palestinian Territory
4th Cycle CPAP (2011–2013)
Result Based Management Monitoring & Evaluation Workshop, Gaza
Prepared by: Rasha Abu Shanab, National Programme Officer – M&E
03–04 May 2011


Presentation Transcript


  1. UNFPA – Occupied Palestinian Territory. 4th Cycle CPAP (2011–2013). Result Based Management Monitoring & Evaluation Workshop, Gaza. Prepared by: Rasha Abu Shanab, National Programme Officer – M&E. 03–04 May 2011

  2. Result Based Management (RBM) • Results-based management is a “management strategy which ensures its processes, products and services contribute to the achievement of desired results”.

  3. Result Based Management • RBM means planning, managing, and reporting for results, supported by an integrated monitoring & evaluation framework. Its purpose is to achieve development results by helping to: • Improve the quality of programme delivery • Measure change • Use resources effectively & efficiently • Improve transparency • Improve accountability

  4. Operationalize RBM • Identifying measurable changes to be achieved; • Developing a chain of results; • Designing activities that will lead to results; • Balancing expected results with the resources available; • Monitoring progress regularly and adjusting activities as needed to ensure that the desired results are achieved; • Evaluating and incorporating lessons learned into decision-making; and • Reporting on results achieved and contribution to reaching defined goals.

  5. WHY RBM? • Improved focus on results instead of activities • Improved transparency • Improved accountability • Enhanced performance orientation • Improved measurement of programme achievements

  6. Result Based Management • Address the “so what?” question • So what about the fact that outputs have been generated? • So what that activities have taken place? • What are the results that the organization is trying to achieve? • Are they being achieved? • How can achievement be proven?

  7. Constructing a Problem Tree • The main purpose of the problem analysis model is to study the root causes and major effects of problems in order to better design solutions. • Categories for analysis of causes: policy constraints, institutional constraints, capacity weaknesses, and social or cultural norms.

  8. Causality Analysis (example problem tree) • Effect: morbidity among pregnant women • Core problem: anemia among pregnant women • Causes: improper nutrition practices among pregnant women; improper clinical practices; unavailable drugs; weak counseling services on nutrition • Root causes: lack of awareness among the community on correct nutrition practices and antenatal care; lack of awareness among women on nutrition and best health practices; logistics and supply chain problems; weak management; socioeconomic status of the household
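
  For illustration, a minimal Python sketch (not UNFPA tooling; the structure and node wording are an assumed reading of the diagram above) of how a problem tree can be captured as nested cause nodes, so each problem can be traced back to its root causes:

    from dataclasses import dataclass, field

    @dataclass
    class ProblemNode:
        statement: str
        causes: list["ProblemNode"] = field(default_factory=list)

        def print_tree(self, depth: int = 0) -> None:
            # Print this problem, then its causes indented one level deeper.
            print("  " * depth + "- " + self.statement)
            for cause in self.causes:
                cause.print_tree(depth + 1)

    tree = ProblemNode(
        "Morbidity among pregnant women",
        [ProblemNode(
            "Anemia among pregnant women",
            [ProblemNode("Improper nutrition practices among pregnant women",
                         [ProblemNode("Weak counseling services on nutrition"),
                          ProblemNode("Lack of awareness among women on nutrition and best health practices")]),
             ProblemNode("Improper clinical practices"),
             ProblemNode("Unavailable drugs",
                         [ProblemNode("Logistics and supply chain problems"),
                          ProblemNode("Weak management")])])])

    tree.print_tree()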

  9. What is a Result? • A change in a state or condition that results from a programmatic intervention • Results derive from a cause-and-effect relationship: achievement of one result is necessary for, and contributes to, achievement of the next • A result statement describes what is to be achieved and should clearly indicate: the change, what will change, the target, for whom, and by when • Three levels of results: • Goal • Outcome • Output

  10. Result Chain • Goal/Impact: changes in the lives of people; the long-term effects or development impact to which the programme contributes at a national level; long-term benefits to beneficiaries. Ex.: reduced maternal mortality • Outcome: institutional and behavioural change; the medium-term effects of an intervention’s outputs. Ex.: increased utilisation of RH services • Output: the products and services which result from the completion of activities within a development intervention; operational change. Ex.: strengthened capacity of an institution, improved awareness and knowledge • Activities: means and tasks undertaken to deliver the output (transform inputs into outputs). Ex.: training conducted, equipment provided
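
  As a worked illustration, a minimal Python sketch (hypothetical, not an official UNFPA schema) of the four-level chain above, pairing each level with an example result statement and indicator drawn from these slides:

    from dataclasses import dataclass, field

    @dataclass
    class Result:
        level: str                 # "Goal/Impact", "Outcome", "Output", or "Activity"
        statement: str
        indicators: list[str] = field(default_factory=list)

    chain = [
        Result("Goal/Impact", "Reduced maternal mortality",
               ["Maternal mortality ratio"]),
        Result("Outcome", "Increased utilisation of RH services",
               ["# of complicated pregnancy cases correctly referred and managed"]),
        Result("Output", "Strengthened capacity of health institutions",
               ["# of health facilities applying the EOC protocol"]),
        Result("Activity", "Train health providers on emergency obstetric care",
               ["# of training sessions conducted"]),
    ]

    # Read the chain top-down: each level is achieved through the one below it.
    for result in chain:
        print(f"{result.level}: {result.statement} (indicator: {result.indicators[0]})")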

  11. Results-Accountability • Goal/Impact: what the project is contributing towards • Outcome: what the project can reasonably be held accountable for • Output and Activities: what is within the direct control of project management • Control increases as you move down the chain from goal to activities

  12. Group Exercise • Define which statement is a GOAL, OUTCOME, OUTPUT, or ACTIVITY, and explain the chain of results

  13. Monitoring & Evaluation

  14. What is the Purpose of Monitoring? • To track programme activities and strategies in order to identify progress towards the achievement of programme results • To identify actual or potential successes and problems in order to facilitate the adoption of corrective measures during programme implementation • To enable stakeholders to make informed decisions that will allow them to achieve their development objectives and to demonstrate results

  15. What is the Purpose of Evaluation? • What worked? What did not work? And why? • To identify successful strategies for replication or expansion, or to modify unsuccessful strategies • To verify/improve programme quality and management • To measure the effects/benefits of a programme • To inform decisions on operations, policy, or strategy related to ongoing or future programme interventions • To demonstrate accountability to decision-makers • It is expected that improved decision-making and accountability will lead to better results and more efficient use of resources.

  16. Evaluation

  17. Monitoring & Evaluation System • Formulate outcomes and outputs (RESULTS) • Select outcome and output INDICATORS to monitor • Gather BASELINE information on the current condition (Where are we today?) • Plan for improvement by selecting specific TARGETS to reach results • Regularly collect DATA to assess whether the targets are being met • Analyze and REPORT results

  18. Indicators • Question: how do we know whether what has been planned is actually happening? • A quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement, to reflect the changes connected to an intervention, or to help assess the performance of a development actor (DAC, 2002) • Helps monitor progress towards the achievement of the objective • An indicator is a tool we use to assess progress or performance • An indicator is not the change itself

  19. Indicators • Clarify the project intervention logic by indicating the target figure to be attained, the timing, the quality sought, and the target groups. • A correct definition of indicators is crucial for any M&E system: by defining the focus for data collection, the indicators provide a map for the monitoring and evaluation activities. • Without a clear set of indicators, monitoring and evaluation activities lose their capacity to assess what is realised against what was agreed and foreseen.

  20. Useful Indicators - DOPA • Direct: closely measure the intended change • Ex.: output “Improved availability of youth-friendly health services” • Which indicator is correct and which is wrong? • Quantity of equipment procured for health facilities • # of service delivery points providing youth-friendly services • # of health providers trained in youth-friendly service provision

  21. Useful Indicators- DOPA Criteria • Objective: clear about what is being measured with clear operational definition • Ex: Unmet Need for Family Planning • Operational definition: % of currently married women aged 15-49 years, who do not want any more children during the next two years but are not currently using any method of contraception

  22. Useful Indicators- DOPA Criteria • Practical: reasonable in terms of data collection cost, frequency, and timeliness for decision-making purposes • Adequate: minimum # of indicators necessary to ensure that progress towards output is sufficiently captured • Two indicators per result

  23. Quantitative and Qualitative Indicators • Quantitative indicators/targets are statistical measures: • Number • Percent • Rate (ex.: birth rate – births per 1,000 population) • Ratio (ex.: sex ratio – number of males per number of females) • Qualitative indicators/targets imply qualitative assessments: • Compliance with • Satisfaction with
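
  As a quick numeric illustration, a minimal Python sketch (all figures hypothetical, purely illustrative) of the two quantitative indicator forms named above, a rate per 1,000 population and a ratio:

    def rate_per_1000(events: int, population: int) -> float:
        # e.g., crude birth rate: births per 1,000 population
        return events / population * 1000

    def ratio(numerator: int, denominator: int) -> float:
        # e.g., sex ratio, commonly expressed as males per 100 females
        return numerator / denominator * 100

    # Hypothetical figures, for illustration only.
    print(rate_per_1000(events=3200, population=100_000))  # birth rate: 32.0 per 1,000
    print(ratio(numerator=51_000, denominator=50_000))     # sex ratio: 102.0 males per 100 females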

  24. Linking Indicators to Results • Goal/Impact: Reduced maternal mortality. Indicator: maternal mortality ratio • Outcome: Increased access to and utilization of good-quality EOC. Indicator: # of complicated pregnancy cases correctly referred and managed according to the national protocol • Output: Strengthened capacity of health institutions to provide good-quality EOC. Indicator: # of health facilities applying the EOC protocol • Activities: Training health providers on emergency obstetric care; upgrading health centers. Indicators: # of training sessions conducted; # of health centers equipped

  25. Indicator Baseline • Baseline data: data that reflect the situation prior to an intervention • Baselines provide context for the setting of targets and capture the situation before a development intervention begins, or at the beginning of a time period that will be monitored and assessed • Without baselines, it is impossible for organizations to assess effectiveness and impact • Baselines and targets capture the same information, but for two different points in time

  26. Indicators - Targets • Targets are the results expected in the context of the specific programme and within a certain time frame. Example of an indicator and targets: • Number of service delivery points (SDPs) per population of reproductive age in each priority district where a package of minimum three types of clinical services and related IEC and counselling activities are offered.

  27. Indicators - Targets • Indicator incorporating a target: • 500 SDPs per 1.5 million population of reproductive age in the three districts of (names) offer FP, maternal health and STI preventive and curative services, as well as related interpersonal counselling, group communication activities and information materials, by 2006.
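
  To make the baseline-to-target relationship concrete, a minimal Python sketch (all figures hypothetical) of tracking an indicator against its baseline and target, reading progress as the share of the planned baseline-to-target change achieved so far:

    def progress_towards_target(baseline: float, current: float, target: float) -> float:
        # 0.0 = still at baseline, 1.0 = target reached
        return (current - baseline) / (target - baseline)

    # Illustrative only: SDPs offering the minimum service package,
    # measured against the 500-SDP target from the example above.
    baseline_sdps = 180  # hypothetical baseline value
    current_sdps = 340   # hypothetical mid-cycle value
    target_sdps = 500    # target from the example indicator

    print(f"{progress_towards_target(baseline_sdps, current_sdps, target_sdps):.0%}")  # 50%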

  28. Indicators - Means of Verification (MOV) • MOV is the data needed to determine the value of the indicator • MOV can be collected through: • Review of documentation • Facility observation/field visits • In-depth interviews • Focus group discussions • Small surveys and population surveys • Use existing data when available, such as the demographic and health information system, to save effort and time

  29. Means of Verification

  30. Coffee Break

  31. CPAP Result Framework • Outcomes: UNFPA & IPs contribution; outcome indicators with end-of-programme-cycle targets (by 2013) • Outputs: IPs contribution, full accountability of UNFPA; output indicators with yearly targets • Sub-outputs: full accountability of IPs; sub-output and process indicators, monitored quarterly

  32. Day Two 25 May 2011

  33. Monitoring & Evaluation Plan • The fundamental document that details a programme’s objectives, the results needed to achieve those objectives, and the procedures for determining whether the objectives are met • Describes the data needed and how these data will be collected and analyzed to demonstrate results • States how a programme will measure its achievements, and therefore provides accountability • The M&E plan should be created during the design phase of a programme.

  34. Monitoring and Evaluation Plan • The UNFPA M&E Plan includes: • CPAP Planning and Tracking Tool: Annex II CPAP tracking tool FINAL-updated 18 May.doc • Used to ascertain the progress of programme outputs and assess their contributions to outcomes and goals during implementation • Established during CPAP formulation in consultation with national counterparts

  35. CPAP Planning and Tracking Tool

  36. Monitoring and Evaluation Plan • The UNFPA M&E Plan includes: • CPAP M&E Calendar: ANNEX III ME Activities Calendar FINAl 16 December.docx • Provides an overview of the monitoring and evaluation activities (surveys, existing monitoring systems, evaluations, programme reviews, etc.) during the country programme cycle • Helps to identify data gaps and how and when the provided data will be used

  37. Accountability for Reporting • UNFPA is mandated to report annually to its Executive Board on progress on the Strategic Plan and the Fund’s contributions • The UNFPA country office reports annually to HQ through the Country Office Annual Report (COAR) • The Executive Board has requested that country programme results and performance data over the duration of the programme be made available at the end of the cycle • The resident coordinator, supported by the UNCT, should report to national authorities on progress made against agreed results

  38. Reporting on Results • Field visits (identify progress towards activities, areas of concern, any review) • AWP Monitoring Tool • Certificate of Expenditure / FACE • Standard Progress Report • Annual Programme Review (achievements, progress towards results, lessons learned)

  39. Field Monitoring Visits • An important means of obtaining first-hand information on the programme implementation context and processes • Expected results: • Issues of concern to partners and beneficiaries • Implementation bottlenecks that could be resolved • Progress of the Annual Work Plan • Progress towards achievement of results • Follow-up on audit recommendations

  40. Field Monitoring Visits • Field visits should be undertaken to project sites • Conduct joint monitoring visits with national implementing partners • Field visits should be planned; determine the specific issues to be assessed during the visit • Document the findings in an FMV report, share recommendations, and follow up on agreed actions • tEMPLATES\FMV Management checkilst.doc • tEMPLATES\FMV Check list for training sessions.docx

  41. Annual Programme Review • Conducted jointly with implementing partners during the last quarter of the year • Takes stock of progress achieved to date in the implementation of all country programme components • Constraining and facilitating factors are reviewed • Programme management and coordination issues are assessed • Produces recommendations for improving programme interventions and programme management • Feeds into the next year’s annual work plans

  42. Work Plan Monitoring Tool • Allows implementing partners and the UNFPA country office to regularly review progress made against the annual work plan and towards achieving country programme outputs • Tracks expenditures related to the achievement of outputs and provides cost data for the analysis of performance issues such as efficiency and cost-effectiveness

  43. Work Plan Monitoring Tool • Tracks risks and assumptions identified in the CPAP, and whether other risks have emerged • Tracks the factors facilitating and constraining programme implementation • Provides inputs for the preparation of Standard Progress Reports

  44. Work Plan Monitoring Tool • Should be submitted each quarter with the Certificate of Expenditure (COE) before the cash advance is released for the next quarter • Submit to the programme officer, with a copy to the finance and M&E officers

  45. Standard Progress Report • Provides information on achieved programme outputs and their contribution to country programme outcomes • Should be prepared in the fourth quarter (early November) of the year as part of the annual review process • The final progress report should be submitted at the end of the programme and should focus on progress achieved throughout the programme cycle, including lessons learned and good practices • Progress reports are submitted to donors • Progress reports are integrated into the Country Office Annual Report submitted to UNFPA HQ in January

  46. Standard Progress Report • To maximize the utility and efficiency of reporting, UNFPA reporting must address a programme or project’s results rather than its activities • Key results achieved: progress towards the CPAP outputs and the national development goals

  47. Reporting on Results - Examples • Reporting on activities: “The CO supported the Ministry of Youth Affairs and Sports in the establishment, renovation, and refurbishment of youth empowerment centres in 3 districts.” But what was the result? • Reporting on results: “By supporting the Ministry of Youth Affairs and Sports in the establishment and running of youth empowerment centres in 3 districts, the CO was able to increase youth attendance at life skills workshops by 45%.”

  48. Examples of Reported Results • Is this good reporting of results? • “750 awareness sessions have been conducted which discussed different issues concerning violence against women, gender equality, reproductive health, and early marriage.” ✗ No: this counts activities delivered, not the change achieved.
