Primer on Monitoring and Evaluation


Presentation Transcript


  1. Primer on Monitoring and Evaluation

  2. The 3 Pillars of Monitoring and Evaluation
  • Identifying the performance indicators
  • Collecting information using appropriate M&E tools and methods
    • Household surveys
    • Facility surveys
    • HMIS
    • Quantifiable supervisory checklists
  • Using M&E results for program decisions

  3. Coverage for Routine Immunization: very low initial reach (<20%)

  4. Coverage for Routine Immunization: failure to sustain coverage after the initial reach

  5. Coverage for Maternal Health: very low initial reach (<35% for antenatal care)

  6. Inequities in Under-five Mortality Rates (DHS 2003)

  7. Under-five Mortality (absolute difference between low and high): Nigeria has the highest difference in the region

  8. Principles of M&E
  • All performance indicators should have baselines and targets; the NSHDP has them (a sketch of such an indicator record follows this slide)
  • Should provide data at the required frequency and with adequate disaggregation
  • Should be able to identify sub-groups that are missing out on services (equity)
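Each indicator can be held as a small record carrying its baseline, target, reporting frequency and required disaggregation, so that progress and equity gaps can be computed routinely. A minimal sketch in Python; the field names and the immunization figures are illustrative, not actual NSHDP baselines or targets:

```python
from dataclasses import dataclass


@dataclass
class Indicator:
    """One NSHDP-style performance indicator with its baseline and target."""
    name: str
    baseline: float                 # value at the start of the plan period (%)
    target: float                   # value to be reached by the end of the plan (%)
    frequency: str                  # how often fresh data are expected, e.g. "annual"
    disaggregation: tuple = ("state", "lga", "wealth_quintile", "sex")

    def progress(self, latest: float) -> float:
        """Share of the baseline-to-target gap already closed (0 = none, 1 = met)."""
        gap = self.target - self.baseline
        return (latest - self.baseline) / gap if gap else 1.0


# Illustrative figures only -- not actual NSHDP baselines or targets.
fully_immunized = Indicator(
    name="Children 12-23 months fully immunized (%)",
    baseline=23.0,
    target=60.0,
    frequency="annual",
)
print(f"{fully_immunized.progress(35.0):.0%} of the way to the target")  # ~32%
```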

  9. Principles of M&E
  • Should use third-party assessments for evaluations
    • Independence
    • Less distraction for the program manager
  • Clearly defined responsibilities for analysis and use of data
  • Availability of dedicated staff and systems/protocols for reviewing and using data
  • Robust enough to meet the data requirements of RBF/CCT/contracting, which require more precision in measuring results (see the sample-size sketch after this slide)
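The extra precision demanded by results-based financing and contracting feeds directly into survey design, most visibly through sample size. A rough sketch of the standard proportion-estimate calculation; the 40% coverage level, 5-point margin, design effect of 2 and 90% response rate below are assumptions for illustration, not programme parameters:

```python
import math


def survey_sample_size(p: float, margin: float, deff: float = 2.0,
                       z: float = 1.96, response_rate: float = 0.9) -> int:
    """Households needed to estimate a coverage proportion p to within +/- margin
    at 95% confidence, inflated for the design effect of cluster sampling and
    for expected non-response (standard proportion-estimate formula)."""
    n = deff * (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n / response_rate)


# Illustrative: estimating ~40% coverage to within +/- 5 percentage points per state.
print(survey_sample_size(p=0.40, margin=0.05))  # -> 820 households per state
```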

  10. Where does Nigeria Stand now?

  11. The NSHDP Results Framework in Place
  Level 5: Inputs
    • Increase in Federal and State budgets allocated for the health sector (%)
  Level 4: Institutional Processes
    • Improved retention of Human Resources for Health (%)
    • Public health facilities having active committees (at least 4 meetings per year) that include community representatives (%)
    • Increase in State HMIS reports meeting minimum quality standards (Number)
  Level 3: Service Delivery Outputs
    • Wards meeting the staffing requirements to deliver the minimum package of services (%)
    • Health personnel receiving competency-based training (Number)
    • Health facilities renovated/rehabilitated (Number)
    • Health centres receiving supplies of essential medicines for the Ward Minimum Health Package (%)
  Level 2: Program Outcomes
    • Increase in children 12-23 months fully immunized (%)
    • Increase in women receiving IPT for malaria during pregnancy (%)
    • Increase in births attended by skilled providers (%)
    • Improved TB case detection rates (%)
    • Reduction in unmet need for FP services (%)
    • Increase in children under five sleeping under an ITN during the previous night (%)
    • Enhanced condom use at last high-risk sex (%)
    • Improved TB cure rates (%)
    • Increase in contraceptive prevalence rates (%)
  Level 1: Health Impact
    • Reduction in under-five mortality rates, maternal mortality ratios, and HIV prevalence among the 15-24 year population

  12. Collecting data on NSHDP performance indicators using appropriate M&E tools and methods
  • Household surveys:
    • DHS is done once every 5 years; a mini-DHS between DHS rounds is a possibility
    • MICS is proposed once every 3 years
    • LQAS is being used for the Malaria + Program; there is scope for using it in other programs, but this requires capacity building at the sub-national level (see the sketch after this slide)
    • Urgent need for more frequent surveys providing disaggregated data for States/LGAs
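LQAS (Lot Quality Assurance Sampling) draws a small fixed sample in each supervision area and applies a pre-set decision rule, so LGA-level teams can classify areas as reaching or missing a coverage target without a large survey. The sketch below computes the binomial classification probabilities for the widely cited 19-household, decision-rule-of-13 design with 80%/50% thresholds; these figures are a textbook illustration, not taken from the Nigeria programme:

```python
from math import comb


def prob_classified_as_passing(n: int, d: int, true_coverage: float) -> float:
    """Probability that an area with the given true coverage is classified as
    'reaching the target', i.e. at least d of the n sampled households report
    receiving the service (binomial upper-tail probability)."""
    return sum(comb(n, k) * true_coverage ** k * (1 - true_coverage) ** (n - k)
               for k in range(d, n + 1))


# Widely used LQAS design: 19 households per area, decision rule of 13,
# checking an 80% coverage target against a 50% lower threshold.
n, d = 19, 13
print(f"P(pass | true coverage 80%) = {prob_classified_as_passing(n, d, 0.80):.2f}")  # ~0.93
print(f"P(pass | true coverage 50%) = {prob_classified_as_passing(n, d, 0.50):.2f}")  # ~0.08
```

Keeping both misclassification probabilities near or below 10% is what makes such a small sample defensible for local supervision decisions, and it is also why scaling LQAS up requires capacity building in sampling and the decision-rule tables at the sub-national level.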

  13. Collecting data on NSHDP performance indicators using appropriate M&E tools and methods
  • Facility surveys:
    • Being done under the Malaria Program
    • Need to develop the design, pilot and implement
  • Quantifiable supervision checklists:
    • Not being done
    • Will be required with an improved results focus
    • Need to design, pilot and implement (a scoring sketch follows this slide)
  • HMIS:
    • In place
    • Quality, coverage and timely reporting remain a concern
    • Requires systems for validation of data
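A quantifiable supervision checklist differs from a narrative one in that every item is scored, so visits can be aggregated, trended and compared across facilities. A minimal sketch of how such a checklist might be scored; the items and weights are hypothetical:

```python
# Hypothetical checklist items and weights, for illustration only.
CHECKLIST_WEIGHTS = {
    "tracer_medicines_in_stock":     2,
    "vaccine_fridge_temperature_ok": 2,
    "staff_present_per_duty_roster": 2,
    "register_up_to_date":           1,
    "monthly_hmis_report_submitted": 1,
}


def score_visit(observations: dict) -> float:
    """Weighted percentage score for one supervisory visit (True = item met)."""
    total = sum(CHECKLIST_WEIGHTS.values())
    earned = sum(w for item, w in CHECKLIST_WEIGHTS.items() if observations.get(item))
    return 100.0 * earned / total


visit = {
    "tracer_medicines_in_stock": True,
    "vaccine_fridge_temperature_ok": True,
    "staff_present_per_duty_roster": False,
    "register_up_to_date": True,
    "monthly_hmis_report_submitted": True,
}
print(f"Facility score: {score_visit(visit):.0f}%")  # -> 75%
```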

  14. Using M&E results for program decisions
  • A lot more work still needs to be done
  • Developing simple tools for annual State/LGA performance ranking (a sketch follows this slide)
  • Capacity building at District and LGA levels on decentralized data analysis
  • Ensuring robust M&E for RBF/performance contracting initiatives
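One candidate "simple tool" is a composite score that ranks each State (or LGA) annually on a handful of agreed indicators. A sketch of the idea; the state names and scores are made up:

```python
# Hypothetical annual scores (%) on a few NSHDP indicators per state.
SCORES = {
    "State A": {"immunization": 62, "skilled_birth": 48, "itn_use": 55},
    "State B": {"immunization": 41, "skilled_birth": 63, "itn_use": 49},
    "State C": {"immunization": 55, "skilled_birth": 52, "itn_use": 61},
}


def rank_states(scores: dict) -> list:
    """Rank states by the unweighted mean of their indicator scores."""
    composite = {state: sum(vals.values()) / len(vals) for state, vals in scores.items()}
    return sorted(composite.items(), key=lambda item: item[1], reverse=True)


for position, (state, value) in enumerate(rank_states(SCORES), start=1):
    print(f"{position}. {state}: {value:.1f}")
# -> 1. State C: 56.0   2. State A: 55.0   3. State B: 51.0
```

An unweighted mean keeps the ranking transparent and easy to explain at annual review meetings; weights can be introduced later once States agree on priorities.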

  15. Proposed Organization of the Session
  • Day 3
    • Quiz: what we know about M&E
    • A brief primer on monitoring and evaluation
    • Presentations on different M&E tools and approaches
  • Day 4
    • Introduction to new M&E tools: LQAS
    • Case study
    • Discussion of next steps on the development of State results chains and specific actions for putting in place M&E systems for disaggregated data generation and use

  16. Distribution of States by Scores Achieved Using a Self-Administered Questionnaire
