
Designing and Implementing Effective Monitoring and Evaluation Systems
VCC Extractive Industries Executive Training
Matthew Harris, Sehrish Bari, Shira Mitchell
June 18, 2014


Presentation Transcript


  1. Designing and Implementing Effective Monitoring and Evaluation Systems VCC Extractive Industries Executive Training Matthew Harris, Sehrish Bari, Shira Mitchell June 18, 2014

  2. Objectives of Session

  3. 1) To introduce the concepts of an M&E system

  4. Program Cycle

  5. Logical Framework • Logical Framework Approach (LFA): A general approach to program planning, monitoring and evaluation that encourages consideration of relationships between resources, activities, and desired changes/results. • Components of Logical Framework: • Inputs (Resources) • Processes (Activities) • Outputs (Results) • Outcomes (Objectives) • Impacts (Goal)

  6. Simplified Logic Model • Logic Model depicts the logical framework components as a linear process • Causal thinking (IF -> THEN) • Input -> Activity -> Output -> Outcome -> Impact • Example: What would a simplified logic model look like for an educational program that can lead to reduction of HIV/AIDS transmission? • Goal: Reduction of HIV/AIDS transmission • Objective: Increasing knowledge in community about HIV/AIDS
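The slide's linear causal chain can be made concrete as a small data structure. The sketch below is illustrative only: the stage names come from the slide, but the model entries for the HIV/AIDS education example are hypothetical wordings of the slide's goal and objective.

```python
# A minimal sketch of the slide's logic model as a linear IF -> THEN chain:
# Input -> Activity -> Output -> Outcome -> Impact.
# The entries below are hypothetical phrasings of the HIV/AIDS example.

STAGES = ["Input", "Activity", "Output", "Outcome", "Impact"]

hiv_education_model = {
    "Input": "Trainers, curriculum, funding",
    "Activity": "Community HIV/AIDS education sessions",
    "Output": "Number of people reached by sessions",
    "Outcome": "Increased knowledge about HIV/AIDS in the community",
    "Impact": "Reduction of HIV/AIDS transmission",
}

def as_causal_chain(model):
    """Render the model as a single IF -> THEN string, in stage order."""
    return " -> ".join(model[stage] for stage in STAGES)
```

Keeping the stages in an explicit ordered list makes the causal direction of the model (IF the input, THEN the activity, and so on) impossible to scramble.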

  7. What Do We Mean by M&E? • Set of procedures & analytical tools to examine • how programs are conducted (inputs & activities) • their level of performance (outputs) • whether they achieved what they were intended to achieve (outcomes & impact)

  8. Monitoring vs. Evaluation Monitoring: What are we doing? Tracking inputs and outputs to assess whether programs are performing according to plans; collecting routine data to measure progress toward achieving goals and objectives (e.g., people trained, materials distributed) Evaluation: What have we achieved? Assessment of the impact of the program on a population, community or industry (e.g., use of environmentally sustainable approaches, economic growth)

  9. Monitoring • A continuous, systematic process of collecting, analyzing and using information to track progress toward program goals and objectives • Provides regular feedback that measures change over time in program components such as costs, personnel and implementation • An unexpected change in monitoring data may trigger the need for a more formal evaluation of activities

  10. Illustration of Program Monitoring [Figure: a program indicator plotted against time, from program start to program end]

  11. Key Monitoring Questions • Is the program being implemented as planned? • Are things moving in the right direction? • Which program activities were more (or less) important/effective in reaching the desired immediate outputs?

  12. Evaluation • A systematic, time-limited process of collecting, analyzing and using information to assess the effectiveness, relevance and impact of a program in achieving its goals • Requires a study design; some evaluation methods require a control or comparison group, and often measurement over time • Often involves measuring changes in knowledge, attitudes, behaviors, skills, community norms, utilization of services or approaches, and long-term indicators of “success” at the population, community or industry level • Provides feedback that helps programs analyze the consequences, outcomes and results of their actions

  13. Key Evaluation Questions • Did the program achieve its objectives? • Did the target population benefit from the program? • At what cost? • Can improved social, environmental, agricultural, or economic outcomes be attributed to program efforts? • Which program activities were more (or less) important/effective? • What would have happened in the absence of the program? • How can we know or measure this (the counterfactual)?

  14. Monitoring vs. Evaluation Monitoring: • Repeat measures • Consistent measures structured around key indicators • More “bare bones” • Quantitative Evaluation: • Tailored and nuanced • Structured around key indicators and evaluation questions • Can give a more in-depth understanding • Can include qualitative measures

  15. Indicators How would we know if we achieved success? What would it look like? Indicator: A quantitative or qualitative variable (something that changes) that provides a simple and reliable measurement of one aspect of performance, achievement or change in a program or project. Indicator of a country’s wealth = GDP

  16. What Role do Indicators Play in M&E? • Reduce large amount of data down to its simplest form • Measure program or project progress towards targets and desired outcomes • Measure trends over time • Provide a yardstick whereby organizations, facilities etc. can compare themselves to others doing similar work • Provide evidence for achievement (or lack of) of results and activities • Process Indicators vs. Outcome Indicators vs. Impact Indicators
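The first two bullets above (reducing raw data to its simplest form and measuring progress toward targets over time) can be sketched in a few lines. The function and data below are hypothetical, invented for illustration; only the idea of a coverage-style indicator comes from the slides.

```python
# Hypothetical sketch: reduce raw monitoring records to one indicator value
# per period, then compare periods to read the trend over time.

def coverage_indicator(records, period):
    """Share of the target population reached in a given period (0..1)."""
    period_records = [r for r in records if r["period"] == period]
    reached = sum(r["reached"] for r in period_records)
    target = sum(r["target"] for r in period_records)
    return reached / target if target else 0.0

# Invented example data: people trained per quarter vs. the target.
records = [
    {"period": "2013-Q4", "reached": 120, "target": 400},
    {"period": "2014-Q1", "reached": 220, "target": 400},
]

q4 = coverage_indicator(records, "2013-Q4")  # 120/400 = 0.30
q1 = coverage_indicator(records, "2014-Q1")  # 220/400 = 0.55
improving = q1 > q4  # the trend the yardstick is meant to reveal
```

The point is the reduction: many raw records collapse into one comparable number per period, which is exactly what lets organizations benchmark themselves against others doing similar work.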

  17. Program Components as They Relate to M&E [Diagram: planning, inputs, processes/activities, outputs, outcomes, and impact mapped to assessments, input/output monitoring, process evaluation, outcome monitoring, outcome evaluation, and impact evaluation, with process, outcome, and impact indicators attached to the corresponding stages]

  18. Indicator Characteristics (SMART) • Specific • Measurable • Achievable • Relevant • Time-bound

  19. Collecting data for indicators: • Existing records (archival data) • Verbal data – individual and group interviews • Surveys - written questionnaires • Observations; reports from trained observers • Participation logs • Mechanical measures or apparatus (e.g. water purity; HIV tests; etc.) • Standardized tests • Site visits (Kusek & Rist, 2004)

  20. Millennium Development Goals • Goal 1: Eradicate extreme poverty and hunger • Goal 2: Achieve universal primary education • Goal 3: Promote gender equality and empower women • Goal 4: Reduce child mortality • Goal 5: Improve maternal health • Goal 6: Combat HIV/AIDS, malaria, and other diseases • Goal 7: Ensure environmental sustainability • Goal 8: Develop global partnership for development

  21. The Millennium Development Goals: Select Indicators

  22. Why Monitor & Evaluate? • To make decisions about project management and service delivery • To ensure effective and efficient use of resources and provide accountability to donors/funders and community • To assess whether the project has achieved its objectives and has the desired effects • To learn from our activities, and provide information to design future projects [Diagram: purposes of M&E — assessing impact; reporting, accountability, and transparency; policy and advocacy]

  23. Developing an M&E Plan

  24. M&E Action Plan – Definition A comprehensive document that describes: • Program objectives, the interventions developed to achieve them, and the procedures to be implemented to determine whether the objectives are met • Expected results of the program and how they relate to goals and objectives • Data needs, how data will be collected and analyzed • Information use, including resources needed to do so • How the program will be accountable to stakeholders

  25. M&E Plan: Suggested Outline 1) Introduction 2) Description of the overall program, including problem statement and framework(s) 3) Indicators, including definitions (presented in an indicator matrix and/or indicator reference sheets: very detailed!) 4) Data sources and reporting systems (including management/roles and responsibilities) 5) Plans for demonstrating program outcome/impact 6) Plans for dissemination and use of information 7) Analysis of data quality constraints & potential solutions 8) Implementation plan (a.k.a. M&E action plan or road map; should include budget and timeline)

  26. Elements of an M&E Plan: Introduction (1) Purpose of the plan Description of how it was developed Stakeholders involved Consensus process

  27. Elements of an M&E Plan: Program Description (2) Problem Statement: What is the nature of the development issue/public investment program being addressed? Conceptual Framework Goal and Objectives: What is the ultimate outcome of the program (goal)? What are the shorter-term aims (objectives)? Program Description: interventions, geographic scope, target population/community/industry, duration Logical Framework (Logic Model)

  28. Elements of an M&E Plan: Indicators (3) Selection of indicators based on: Conceptual and logic frameworks Strategic information needed for decision-making at appropriate level (country/state/local) Donor or funder, government, stakeholder requirements Existing data Funding (available and dedicated to M&E) Presented in two ways: Indicator Matrix- a table presenting indicators including information on data source, frequency and who is responsible Indicator Reference Sheets- detailed sheets describing each indicator, how to measure it, underlying assumptions & interpretation considerations (may be included as appendices)
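The indicator matrix described above is, in practice, a small table: one row per indicator with its data source, collection frequency, and responsible party. The sketch below is a hypothetical example; the column names mirror the slide's description, but the two indicators and their details are invented.

```python
# Hypothetical sketch of an indicator matrix: one row per indicator, with
# data source, collection frequency, and responsible party.
import csv
import io

indicator_matrix = [
    {"indicator": "% of farmers using improved seed",
     "data_source": "Farm survey", "frequency": "Annual",
     "responsible": "Agriculture team"},
    {"indicator": "Clinic visits per 1,000 population",
     "data_source": "Clinic registers", "frequency": "Monthly",
     "responsible": "Health team"},
]

def matrix_to_csv(rows):
    """Serialize the matrix as CSV for sharing with stakeholders."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Keeping the matrix in a plain tabular form like this makes it easy to review with stakeholders and to attach the more detailed indicator reference sheets as appendices.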

  29. Elements of an M&E Plan: Data Sources and Reporting Systems (4) Sources of data for indicators Diagram of data collection, processing, analysis, and reporting system Data collection tools Records or registers Survey instruments Commodity management forms (materials distributed) Others? Management Roles and responsibilities of each group/member of the M&E system Data Flow

  30. Elements of an M&E Plan: Plan for Demonstrating Program Outcome/Impact (5) A methodology for measuring program outcome or impact (the evaluation, explained later in this presentation) Protocols for special studies, e.g., nutrition assessment of children under five in a development project

  31. Elements of an M&E Plan: Dissemination and Use Plan (6) Clearly defined users Databases for information storage Dissemination methods Reports (schedule and audience) Media outlets (interviews, press releases, etc.) Speaking events Community feedback Others?

  32. Elements of an M&E Plan: Data Quality Assurance (7) Describe known constraints to data quality and/or system performance What will be done to address these constraints?

  33. Elements of an M&E Plan: Implementation Plan (8) Assessment of feasibility to implement plan A detailed work plan for the M&E Plan to include: Each M&E activity (including update of M&E Plan) Timing of each activity Party responsible for each activity Budget necessary for each activity

  34. M&E Costs to Consider How much will it cost? Benchmark 5-10% of total program budget (rough estimate) Need to budget: Costs of information systems (costs of data collection, processing, and analyzing) Costs of information dissemination and use Costs of the data quality control system Costs of coordination and capacity building Human Resources for M&E (coordinator, data collectors, analysts)
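The 5-10% benchmark above is simple arithmetic, but it is worth making explicit. The sketch below uses a hypothetical $2M program budget and an invented split across the cost categories listed on the slide; neither figure comes from the source.

```python
# Arithmetic sketch of the 5-10% benchmark: given a total program budget,
# compute the M&E envelope, then split it across the slide's cost categories.
# The $2,000,000 budget and the category shares are hypothetical.

def me_budget_range(total_budget, low=0.05, high=0.10):
    """Return the (low, high) M&E envelope for a total program budget."""
    return total_budget * low, total_budget * high

low, high = me_budget_range(2_000_000)  # $100,000 to $200,000

# Invented allocation of the low-end envelope across cost categories
shares = {
    "information systems": 0.40,
    "dissemination & use": 0.15,
    "data quality control": 0.15,
    "coordination & capacity building": 0.10,
    "human resources": 0.20,
}
allocation = {item: low * share for item, share in shares.items()}
```

Writing the split down this way forces the shares to sum to 1.0, which is a useful sanity check when the categories are negotiated line by line with a funder.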

  35. Developing & Implementing an M&E Plan: Dos and Don’ts Do: Start early Involve stakeholders at all stages in the process Assess current capacity and use what is already available Avoid duplication of data collection and reporting Do not: Collect information that will not be used Underestimate the importance of stakeholder buy-in and ownership at every juncture

  36. Monitoring and evaluation is important but ineffective if the information is not used. Setting up systems to facilitate M&E use is just as important as the M&E systems designed to collect the data.

  37. 2) To illustrate its application to a large-scale development project (Millennium Villages Project)

  38. Design & Implementation of a Multi-Sectoral Project [Diagram: interventions at the country, district, and community levels across infrastructure, business development, agriculture, health, and education]

  39. MVP is divided into sectors, each with a strategy: • Agriculture • Water & Sanitation • Energy • Environment • Technology & Innovation • Gender Equality • Maternal & Child Health • Education • Business Entrepreneurship

  40. Example 1: AGRICULTURE 1. Increasing sustainable agricultural crop production; 2. Improving food and nutrition security; 3. Farm diversification for income generation; and 4. Underpinning sustainability by restoring and conserving the natural resource base.

  41. Example 2: HEALTH • Building and operating clinics and dispensaries to address: • Child malnutrition • Contraception access • Malaria prevention and treatment • Emergency obstetrics services • HIV testing • Free primary health care

  42. Example 3: WATER & SANITATION [Images: protected spring; borehole]

  43. A Millennium Village (Artist’s Rendering)

  44. MVP Chain of Stakeholders Village Community Members --- Local Government --- Site Teams --- MDG Regional Centers --- National Governments --- The Earth Institute at Columbia University --- Millennium Promise (NGO) --- United Nations --- Local Implementing Partners --- External Donors

  45. Overview of the MVP M&E Structure

  46. [Diagram: the M&E platform evaluates MDG progress]

  47. Outcome Monitoring & Surveys

  48. Outcome Monitoring • A continual and systematic process of collecting and analyzing data to measure the performance of interventions toward achievement of outcomes • Alerts managers to problems; increases accountability • Promotes timely performance-based decision-making
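The "alert managers to problems" role of outcome monitoring amounts to comparing actuals against plan and flagging what falls behind. The sketch below is hypothetical: the 10% tolerance, the indicator names, and the figures are all invented for illustration.

```python
# Hypothetical sketch: flag indicators that fall behind plan so managers
# get a timely, performance-based signal.

def behind_plan(actual, planned, tolerance=0.10):
    """True if actual is more than `tolerance` (default 10%) below plan."""
    return actual < planned * (1 - tolerance)

# Invented progress data: indicator -> (actual, planned)
progress = {
    "bed nets distributed": (8_200, 10_000),
    "clinic visits": (950, 1_000),
}

alerts = [name for name, (actual, planned) in progress.items()
          if behind_plan(actual, planned)]
# bed nets: 8,200 < 9,000 -> alert; clinic visits: 950 >= 900 -> on track
```

A rule this simple is deliberately "bare bones": routine monitoring only needs to say whether things are moving in the right direction, while the deeper "why" questions are left to evaluation.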

  49. OM in the Context of the LogFrame Implementation Level (more control): • Inputs: resources mobilized to support activities • Activities/Processes: actions or work that convert inputs into specific outputs • Outputs: the tangible products or services those activities produce Results Level (less control): • Outcomes: use of outputs by targeted populations (intermediate objective) • Impacts: final objective of the program (long-term goal)
