Monitoring & Evaluation System "Learning to Improve": Making evidence work for development
Contents • Rationale of the M&E • Users and needs • Principles of the M&E • Units of analysis and dimensions of study • Elements of the M&E • Products of the M&E • M&E share of responsibilities • M&E constraints • M&E linkages: KM, decision making and learning
Rationale: Why should the MDGF have an M&E system? • It is a requirement, included in the legal agreement between the donor (Spain) and UNDP • It is an obligation, included in all signed joint programs • It is necessary if we want to scale up programs into policies and spread solutions to achieve the MDGs at the global level • It is useful: it is part of the program management cycle and the best way to measure progress, detect problems, correct them, improve performance and learn at the local and global levels
Who are the users of the M&E system, and what are their needs in terms of information?
For a 5-minute discussion: Are we missing anyone? Are most of the stakeholders' views included? This is an attempt to let your tacit knowledge flow
PRINCIPLES: What kind of principles should the MDGF's M&E system incorporate?
What kind of principles should the MDGF's M&E system incorporate? • Accordance with UNEG and DAC/OECD standards • Oriented to a well-balanced mix of learning and accountability purposes • Evidence-based: consistent data, information or knowledge to support the judgments and conclusions of monitoring and evaluation
What kind of principles should the MDGF's M&E system incorporate? • Built on an aggregation scheme: elements of M&E (indicators, evaluations, etc.) at lower levels add up at higher levels of inquiry • Measure change (delta), describe, analyze and understand the object of study (JP + C + W + MDGs) and use the results to improve program and policy performance
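The aggregation principle above can be sketched as a simple roll-up: indicator values measured at the joint-program level add up through countries and thematic windows to the Fund as a whole. The sketch below is purely illustrative; the window, country and program names and the numbers are hypothetical, not real MDG-F data.

```python
# Hypothetical sketch of the M&E aggregation scheme:
# joint-program (JP) indicators roll up to country, window and Fund level.
from collections import defaultdict

# (window, country, joint_program) -> indicator value (illustrative only)
jp_indicators = {
    ("Gender Equality", "Country A", "JP-1"): 120,
    ("Gender Equality", "Country A", "JP-2"): 80,
    ("Gender Equality", "Country B", "JP-3"): 50,
    ("Environment", "Country A", "JP-4"): 200,
}

def roll_up(indicators):
    """Aggregate JP-level indicator values to the three higher M&E levels."""
    by_country = defaultdict(int)   # (window, country) -> total
    by_window = defaultdict(int)    # window -> total
    fund_total = 0                  # MDG Achievement Fund as a whole
    for (window, country, _jp), value in indicators.items():
        by_country[(window, country)] += value
        by_window[window] += value
        fund_total += value
    return by_country, by_window, fund_total

countries, windows, fund = roll_up(jp_indicators)
print(windows["Gender Equality"])  # 250
print(fund)                        # 450
```

In practice aggregation would be weighted and indicator-specific rather than a plain sum, but the structure — lower levels of inquiry adding up to higher ones — is the same.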
What dimensions should the MDGF M&E system cover? What questions should it answer?
1st M&E Level: Joint Programs • Monitoring aspects: inputs, products, results, processes • Evaluation dimensions: quality of the program formulation; attainment of program objectives; contribution to the MDGs, other development indicators and gender; replication and scale-up; innovation; UN system coordination (Delivering as One: ownership, alignment, harmonization, MfDR, mutual accountability); delta change, effects on citizens' lives
2nd M&E Level: Countries • Monitoring aspects: results (UNDAF + PRS); processes (coordination) • Evaluation dimensions: induced effects on the MDGs at country level and on other development indicators; UN system coordination (Delivering as One: ownership, alignment, harmonization, MfDR, mutual accountability); UN country pilots
3rd M&E Level: Thematic Windows • Monitoring aspects: results as an aggregate of the joint programs; human rights / gender / environment • Evaluation dimensions: new themes (culture and migration); induced effects on the MDGs at country level and on other development indicators (peace and culture); UN system coordination (Delivering as One: ownership, alignment, harmonization, MfDR, mutual accountability)
4th M&E Level: MDG Achievement Fund • Monitoring aspects: results • Evaluation dimensions: quality of the Spain/UNDP partnership and its added value as a mechanism to advance MDG achievement; the Secretariat's role and added value; induced effects on the linkage of windows and the MDGs and on other development indicators; UN system coordination (Delivering as One: ownership, alignment, harmonization, MfDR, mutual accountability); effects on citizens' lives
Thematic windows: Culture & Development; Gender Equality and Women's Empowerment; Conflict Prevention & Peace Building; Children, Food Security and Nutrition; Democratic Economic Governance; Youth, Employment and Migration; Environment and Climate Change; Development and the Private Sector
Please, in 5 minutes: What are the 3 most burning questions an M&E system should answer?
What elements comprise the MDGF M&E system? • Monitoring indicators: to measure progress and trends in the short and medium term at each level of the results chain (inputs, activities, outputs, outcomes) • Field visits: to monitor joint programs in depth, prepare and manage evaluations, disseminate results and provide feedback on recommendations • Evaluations: to review programs and assess the worth of each dimension of study: joint programs, the country level, thematic windows and the MDGF as a whole • Meta-evaluations: to review the quality of the evaluations conducted (JP + country) and produce robust evidence at window level, linking this evidence to MDG achievement • Desk reviews and data collection & analysis: drawing on a variety of sources to contribute information and knowledge to the M&E + KM system
What products will the MDGF M&E system create and offer? • Field monitoring reports • Country monitoring reports • Mid-term evaluation reports • Impact evaluation reports • Country evaluation (case study) reports • Meta-evaluation reports • In-depth review reports • MDGF global reports (mid-term + final) • Special activities under the M&E + Information & Advocacy focus-country initiative
What are the constraints we would face in such an enterprise? Budget, time, data, political constraints • What level of involvement and what workload can the staff of the joint programs take on? • Organizational issues: who is responsible for which part of the system? • Joint programs, for real? Do joint programs have the necessary coordination mechanisms to allow for joint monitoring and evaluation? • Will the users of the M&E system be willing to collaborate in implementing it? • Is data available at a reasonable cost and on time? • Will we have the political support to implement it?
M&E linkages: KM, decision making and learning • M&E is an enormous source of knowledge: it will generate an extraordinary amount of valuable explicit and tacit knowledge, as well as organizational and general knowledge, that will have to be delivered at the right moment, to the right persons, in the adequate format for the purposes needed. • The KM system could also become a source of data, information and knowledge for the M&E system; it is a reciprocal, two-way relation. • The ultimate goal of evaluation is to recommend actions that improve decision making and the performance of programs and policies, so all evaluations should focus on the utilization of their findings and recommendations