
MONITORING & EVALUATION






Presentation Transcript


  1. MONITORING & EVALUATION: Capacity Building for Program Improvement. Global AIDS Program, Centers for Disease Control and Prevention, and ORC Macro

  2. Training Focus • Program Improvement • Share Data with Partners • Reporting/Accountability

  3. Training Objectives By the end of the training, participants will be able to: • Describe the use of M&E data for program planning, management, and improvement • Understand and initiate program monitoring and other evaluation activities as standard program components within technical strategies • Work towards developing strategy-specific M&E plans and an integrated M&E strategy • Identify M&E TA needs

  4. Relationship Between Planning, Implementation, and Outcomes [Diagram: a cycle linking Planning, Implementation, and Outcomes]

  5. Ideal Relationship among CAPs/Budget, M&E & Reporting [Flow diagram: Develop Program Plan & Budget (plan due Oct. 1) → Country Assistance Plan Objectives (due Oct. 1) → Logic Model → Design M&E / M&E Plan → Implement M&E Activities → Annual Report (due Dec. 1)]

  6. Relationships Among CAPs/Budget, M&E & Reporting [Flow diagram: the same cycle as slide 5: Develop Program Plan & Budget (plan due Oct. 1) → Country Assistance Plan Objectives (due Oct. 1) → Logic Model → Design M&E / M&E Plan → Implement M&E Activities → Annual Report (due Dec. 1)]

  7. CDC/GAP Program Model • Primary Prevention: voluntary counseling and testing (VCT); blood safety; STI prevention and care; youth interventions; public-private partnerships; behavior change communication; preventing HIV transmission in drug-using populations; preventing mother-to-child transmission (PMTCT) • Surveillance: HIV, STI, and TB disease surveillance and behavioral surveillance • Care and Treatment: prevention and treatment of opportunistic infections; TB prevention and care; palliative care; appropriate use of antiretroviral drugs • Capacity and Infrastructure Strengthening: monitoring and evaluation; training; laboratory support; information systems

  8. CDC HIV Strategic Plan International Objectives: • Decrease sexually transmitted HIV infections • Develop capacity in host countries for HIV prevention & care • Strengthen HIV/STD/TB surveillance • Improve scientific knowledge of HIV and the safety and efficacy of new biomedical interventions • Decrease HIV infections transmitted from mother to child • Increase access to HIV care and support, including prevention & treatment of opportunistic infections • Decrease parenterally transmitted HIV infections

  9. Multi-agency M&E Logic Model [Diagram: USG, Global Fund, World Bank, and other inputs flow into NAP programs, which produce outputs, short-term outcomes, intermediate outcomes, and long-term impacts] Adapted from Milstein & Kreuter, A Summary Outline of Logic Models: What Are They and What Can They Do for Planning and Evaluation? CDC, 2000

  10. Multi-agency M&E Logic Model [Diagram: the same logic model with agency-specific inputs: CDC inputs, USAID inputs, Partnership Agency X inputs (USG partnership), and Partnership Agency Y inputs (global partnership) flow into NAP programs, producing outputs, short-term outcomes, intermediate outcomes, and long-term impacts]

  11. Global M&E Indicator Pyramid: Levels of Indicators • Global Level Indicators (UNGASS) • Country Level Indicators (CDC, NAP/UNAIDS, USAID Missions) • Project Level Indicators (MTCT, STI, VCT, TB, Care and Treatment, etc.)

  12. CDC/GAP M&E Mission Statement: Strengthen M&E systems that inform HIV/AIDS policy and program decisions at the local, national and global levels

  13. CDC/Global AIDS Program Monitoring and Evaluation Goals The CDC/GAP M&E system has two goals: • Determine the progress and effectiveness of CDC/GAP programs and assistance activities • Strengthen the capacity of National AIDS Programs to conduct monitoring and evaluation

  14. Global AIDS Program M&E Framework and Illustrative Data Types • Assessment & Planning (program development data): situation analysis; response analysis; stakeholder needs; resource analysis; collaboration plans • Input (Resources): staff; funds; materials; facilities; supplies • Activities (Interventions, Services): trainings; services; education; treatments; interventions • Output (Immediate Effects): # staff trained; # condoms distributed; # test kits distributed; # clients served; # tests conducted • Outcomes (Intermediate Effects): provider behavior; risk behavior; service use behavior; clinical outcomes; quality of life; social norms • Impact (Long-term Effects): HIV incidence/prevalence; STI incidence/prevalence; AIDS morbidity/mortality; economic impact. Input, activity, and output measures draw on program-based data; outcome and impact measures draw on population-based biological, behavioral & social data. In addition to monitoring these illustrative data types, select programs conduct enhanced process and outcome evaluations.
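The framework above can be read as a simple data dictionary: each stage has a characteristic data source and a set of illustrative indicators. As a minimal sketch only (hypothetical names, not CDC/GAP tooling), the structure might be represented like this in Python:

```python
# A minimal sketch of the M&E framework stages: each stage groups its
# characteristic data source with illustrative indicators, so indicators
# can be listed and reported by stage. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str                   # e.g., "Output (Immediate Effects)"
    data_source: str            # e.g., "program-based data"
    indicators: list[str] = field(default_factory=list)

FRAMEWORK = [
    Stage("Input (Resources)", "program-based data",
          ["staff", "funds", "materials", "facilities", "supplies"]),
    Stage("Output (Immediate Effects)", "program-based data",
          ["# staff trained", "# condoms distributed", "# clients served"]),
    Stage("Impact (Long-term Effects)", "population-based data",
          ["HIV incidence/prevalence", "AIDS morbidity/mortality"]),
]

for stage in FRAMEWORK:
    print(f"{stage.name} [{stage.data_source}]: {', '.join(stage.indicators)}")
```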

  15. 9 Critical Elements of GAP M&E Strategy [Diagram: nine elements mapped across Start-up/Phase 1, Phase 2, and Phase 3: M&E needs assessments & trainings; partnerships; CAPs & logic models; M&E plans, annual reports, & program reviews; program monitoring & process evaluation; national-level outcome monitoring; national-level impact monitoring; case studies, operations & intervention research, economic evaluations; systematic review]

  16. A Public Health Questions Approach to Unifying AIDS M&E [Pyramid of questions, with methods for each level, reading from the bottom up:] • Problem Identification: What is the problem? (situation analysis and surveillance) What are the contributing factors? (determinants research) • Understanding Potential Responses: What interventions can work (efficacy & effectiveness)? Are we doing the right things? (special studies, operations research, formative research & research synthesis) • Monitoring & Evaluating National Programs: INPUTS: What interventions and resources are needed? (needs, resource, response analysis & input monitoring) ACTIVITIES: What are we doing? Are we doing it right? (process monitoring & evaluation, quality assessments) OUTPUTS: Are we implementing the program as planned? (outputs monitoring) OUTCOMES: Are interventions working/making a difference? (outcome evaluation studies) • Determining Collective Effectiveness (OUTCOMES & IMPACTS MONITORING): Are collective efforts being implemented on a large enough scale to impact the epidemic (coverage; impact)? (surveys & surveillance)

  17. Strategic Planning for M&E: Setting Realistic Expectations [Pyramid: the monitoring and evaluation pipeline, plotting # of projects against level of M&E effort. All projects conduct input/output monitoring, most conduct process evaluation, some conduct outcome monitoring/evaluation, and few conduct impact monitoring/evaluation*] * Supplemented with impact indicators from surveillance data. Adaptation of Rehle/Rugg M&E Pipeline Model, FHI 2001

  18. Strategic Planning for M&E: Setting Realistic Expectations Phase 1: Typical Methods • Situational analysis • Response analysis, stakeholder needs & resource analysis • Inputs/outputs monitoring (e.g., # staff trained, # condoms distributed, # clients served)

  19. Strategic Planning for M&E: Setting Realistic Expectations Phase 2: Typical Methods • Process evaluation (e.g., quality of training, client satisfaction or perceptions) • Quality assessments • Operations research & formative evaluation • Case study • Cost analysis

  20. Strategic Planning for M&E: Setting Realistic Expectations Phase 3: Typical Methods • Monitoring outcome indicators (e.g., increase in condom use, increase in knowledge about HIV transmission) • Outcome evaluation (e.g., was the program responsible for behavior change?)

  21. Strategic Planning for M&E: Setting Realistic Expectations Phase 4: Typical Methods • Impact monitoring (e.g., disease surveillance) • Impact evaluation (e.g., rise or fall of disease incidence/prevalence as a function of AIDS programs)

  22. International M&E Standards • Monitoring and Evaluation Reference Group (MERG) • UNAIDS, since 1998… • International gold standard for indicators (now sharing with WHO) • UNGASS (United Nations General Assembly Special Session) • Global commitment embodied in accepted national indicators to be tracked and reported at global level • CRIS (Country Response Information System) • USG support • USAID-DHHS financial and technical assistance (leading funder) • National M&E trainings and workshops • These efforts support the crucial link between standardized indicators and standardized data collection methods/mechanisms at the country level

  23. International M&E Standards: Harmonized Tools and Methods Guidance • UNAIDS, WHO, UNICEF, USAID, HRSA, GFATM (Global Fund to Fight AIDS, Tuberculosis and Malaria), and the World Bank are major CDC/GAP partners in the development of indicators and M&E guidance to national governments • New indicator and M&E guidance on: • VCT • PMTCT • Care and Support • ART • Orphans

  24. UNAIDS and UNGASS Indicators • National indicators; a subset is tracked globally • Core national program indicators with focus on: • Coverage (% of the population “at risk” getting an intervention or responding to an intervention) and • Impact: HIV infection, morbidity, mortality
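As an illustration of the coverage indicator described above, a minimal sketch follows (hypothetical numbers, not UNAIDS tooling): coverage is simply the share of the estimated at-risk population reached by an intervention.

```python
# Hypothetical illustration of the coverage calculation:
# coverage = people reached by an intervention / estimated population at risk.
def coverage(reached: int, population_at_risk: int) -> float:
    """Return coverage as a percentage of the at-risk population."""
    return 100.0 * reached / population_at_risk

# e.g., 45,000 pregnant women counseled out of an estimated 180,000 in need
print(f"PMTCT counseling coverage: {coverage(45_000, 180_000):.1f}%")  # 25.0%
```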

  25. What Is Monitoring & Evaluation? • Monitoring: • Tracks priority information relevant to national program planning and intended outputs, outcomes and impacts. • Tracks costs & program functioning. • Provides basis for program evaluation when linked to a specific program. • Evaluation: • Is a rigorous, scientifically based collection of information about program activities, characteristics, and outcomes to determine the merit or worth of a specific program. • Is used to improve programs and inform decisions about future resource allocations.

  26. M&E Terminology • Assessment & Planning: • Collection of information and data needed to plan programs and initiatives. These data may describe the needs of the population and the factors that put people at risk, as well as the context, program response, and resources available (financial and human). • Answers questions such as: • What are the needs of the population to be reached by the program/initiative? • How should the program/initiative be designed or modified to address population needs? • What would be the best way to deliver this program/initiative?

  27. M&E Terminology (continued) • Input/Output Monitoring: • Collects data describing the individuals served, the services provided, and the resources used to deliver those services. • Answers questions such as: • What services were delivered? What population was served and what numbers were served? What staffing/resources were used? • Process Evaluation: • Collects more detailed data about how the intervention was delivered, differences between the intended population and the population served, and access to the intervention. • Answers questions such as: • Was the intervention implemented as intended? Did the intervention reach the intended audience? What barriers did clients experience in accessing the intervention?

  28. M&E Terminology (continued) • Outcome Monitoring: • Basic tracking of measures related to desired program outcomes. With National AIDS programs, outcome monitoring is typically conducted through population-based surveys to track whether or not desired outcomes have been reached. May also track information directly related to program clients, such as change in knowledge, attitudes, behavior. • Answers the question: • Did the expected outcomes occur, e.g., increase in condom use; increase in knowledge or change in behavior; increase in client use of services?

  29. M&E Terminology (continued) • Outcome Evaluation: • Collects data about outcomes before and after the intervention for clients as well as with a similar group that did not participate in the intervention being evaluated. • Answers the question: • Did the intervention cause the expected outcomes?

  30. M&E Terminology (continued) • Impact Monitoring and Evaluation: • Collects data about HIV infection at the jurisdictional, regional, and national levels. • Answers the question: • What long-term effects do interventions have on HIV infection? Distinction between Impact Monitoring and Evaluation • Impact monitoring (e.g., disease surveillance). • Impact evaluation (e.g., rise or fall of disease incidence/prevalence as a function of AIDS programs).

  31. M&E Compared to Other Concepts • Academic Research: Primarily, hypothesis testing in a controlled environment. • Disease Surveillance: Ongoing systematic collection, analysis, and interpretation of data that describe diseases and their transmission in populations. • Operations Research/Evaluation: Applies systematic research techniques to improve service delivery and influence related program policies. • Policy Evaluation: Assessments of application and effectiveness of policies. • Economic Evaluation: Assessments to identify, measure, value, and compare the costs and outcomes of alternative interventions.

  32. Other Notes on M&E Language Case Studies: A methodological approach that typically incorporates a number of data-gathering activities (e.g., interviews, observations, and questionnaires) at select sites or programs. In the GAP context, case studies are done at the country level to determine CDC’s overall “value added”. The findings are then used to report to stakeholders, make recommendations for program improvement, and for sharing lessons with other countries.

  33. M&E Related to Planning, Implementation and Outcomes [Diagram: the Planning → Implementation → Outcomes cycle, annotated with the M&E activities corresponding to each stage: Assessment & Planning; Input/Output Monitoring; Process Evaluation; Outcome Monitoring; Outcome Evaluation; Impact Monitoring; Impact Evaluation]

  34. Implications of Not Knowing How an Intervention Was Implemented [Diagram: Planning → ??????? → Outcomes, with the implementation step unknown]

  35. CDC/GAP Field Office M&E Activities “GAP M&E Racetrack” • In field/regional-office groups, discuss M&E activities that are planned, initiated, completed, institutionalized or ongoing. • Complete the racetrack by placing dots in appropriate boxes. • Choose a field/regional-office spokesperson to present a 10-minute overview of your M&E activities.

  36. Goals and Objectives • Goal: Statement of a desired, broad, long-term outcome of the program; expresses general program intentions and helps guide the program’s development. • Objective: Statement of desired, specific, reasonable, and measurable program results.

  37. M&E Training Goal and Objectives • Goal of training: • To equip GAP staff with an understanding of M&E and the knowledge and skills needed to incorporate M&E activities into everyday work with programs. • Training objectives: • By the end of this training, participants will be able to: • Describe the use of M&E data for program planning, management, and improvement. • Understand and initiate program monitoring and other evaluation activities as standard program components within technical strategies.

  38. A Public Health Questions Approach to Unifying AIDS M&E [The same question pyramid as slide 16, without the methods:] • Problem Identification: What is the problem? What are the contributing factors? • Understanding Potential Responses: What interventions can work (efficacy & effectiveness)? Are we doing the right things? • Monitoring & Evaluating National Programs: INPUTS: What interventions and resources are needed? ACTIVITIES: What are we doing? Are we doing it right? OUTPUTS: Are we implementing the program as planned? OUTCOMES: Are interventions working/making a difference? • Determining Collective Effectiveness (OUTCOMES & IMPACTS MONITORING): Are collective efforts being implemented on a large enough scale to impact the epidemic (coverage; impact)?

  39. Process and Outcome Objectives Output: # of clients tested for HIV who receive test results. Objective: By the end of the first program year, 98% of clients tested for HIV will receive test results. Outcome: Clients (HIV+ and HIV-) develop and adhere to a personalized HIV risk-reduction and treatment strategy. Objective: By the beginning of the second program year, 65% of clients receiving HIV test results will have formed personalized risk-reduction/treatment strategies.
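To make the monitoring step concrete, here is a minimal sketch (hypothetical counts and function names) of checking an achieved value against the 98% process-objective target above:

```python
# Hedged sketch: compare the share of tested clients who received results
# against a numeric target, as in the 98% process objective above.
def objective_met(numerator: int, denominator: int, target_pct: float) -> bool:
    achieved = 100.0 * numerator / denominator
    print(f"Achieved {achieved:.1f}% (target {target_pct}%)")
    return achieved >= target_pct

# e.g., 4,870 of 5,000 tested clients received their results (hypothetical)
objective_met(4_870, 5_000, 98.0)   # Achieved 97.4% (target 98.0%) -> False
```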

  40. Sample Objectives • Process Objective: • Provide VCT clients with HIV test results. • Outcome Objective: • Assist VCT clients with developing personalized risk-reduction and treatment strategies.

  41. SMART Method • Specific: Identifies concrete events or actions that will take place. • Does the objective clearly specify what will be accomplished and by how much? • Measurable: Quantifies the amount of resources, activity, or change. • Is the objective quantifiable? • Appropriate: Logically relates to the overall problem statement and desired effects of the program. • Does the objective make sense in terms of what the program is trying to accomplish? • Realistic: Provides a realistic dimension that can be achieved with available resources and plans for implementation. • Is the objective achievable given available resources and experience? • Time-based: Specifies a time within which the objective will be achieved. • Does the objective specify when it will be achieved?

  42. Examples of SMART Objectives • Process Objective: By the end of the first program year, 98% of clients tested for HIV will receive test results. • Outcome Objective: By the beginning of the second program year, 65% of clients receiving HIV test results will have developed and adhered to personalized risk-reduction/treatment strategies. • Specific: Does the objective clearly specify what will be accomplished and by how much? • Measurable: Is the objective measurable? • Appropriate: Does objective make sense in terms of what the program is trying to accomplish? • Realistic: Is the objective achievable given available resources and experience? • Time-based: Does the objective specify when it will be achieved?

  43. Examples of Objectives • Improve use of TB register data (as measured by decreases in the transfer-out rate, interruption rate & missing data, as well as improved validity of data) in 8 of 9 provinces by March 31, 2004. ______________________________________________________ • Pilot and evaluate a Windows-based Electronic TB Register (ETR) in 1 province by 3/31/03. • Decrease the transfer-out rate to < 7% in 8 provinces within 12 months of implementation of ETR (baseline 2000, 11.1%). • Decrease the interruption rate to < 10% in districts within 12 months of implementation of ETR (baseline 2000, 15.1%). • Decrease missing data to less than 10% within 12 months of implementing ETR. • Conduct TA visits in 8/9 provinces and support implementation of the ETR, particularly regarding validation and use of data.
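As an illustration of how register indicators like these might be computed, a hedged sketch follows; the record layout and values are hypothetical, not the actual ETR schema:

```python
# Hypothetical row-level TB register records; "outcome" and "smear_result"
# are illustrative field names, with None standing in for missing data.
records = [
    {"outcome": "cured",       "smear_result": "neg"},
    {"outcome": "transferred", "smear_result": None},
    {"outcome": "interrupted", "smear_result": "pos"},
    {"outcome": "completed",   "smear_result": "neg"},
]

n = len(records)
transfer_out_rate = 100.0 * sum(r["outcome"] == "transferred" for r in records) / n
interruption_rate = 100.0 * sum(r["outcome"] == "interrupted" for r in records) / n
missing_rate      = 100.0 * sum(r["smear_result"] is None for r in records) / n

print(f"Transfer-out: {transfer_out_rate:.1f}% (target < 7%)")
print(f"Interruption: {interruption_rate:.1f}% (target < 10%)")
print(f"Missing data: {missing_rate:.1f}% (target < 10%)")
```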

  44. Small Group Work on Writing Objectives: Instructions • Develop and/or revise CAP objectives for a minimum of two technical strategies (and complete as many technical strategies as possible). • Link specific activities to CAP objectives for as many technical strategies as possible. • Note: It may be easier to work through one technical strategy at a time (e.g., develop CAP objectives for VCT and link specific activities to these objectives before moving on to another technical strategy). * Field/regional offices will have an opportunity to report back on their experience writing CAP objectives during the country/region debrief, Wednesday 2:45-3:45.

  45. Logic Model Definition A logic model describes the main elements of a program and how they work together to prevent HIV in a specific population. This model is often displayed in a flow chart, map, or table to portray the sequence of steps leading to program outcomes.
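As a hedged illustration of the table form mentioned in the definition above (the program content is hypothetical, loosely modeled on a VCT program), a logic model can be laid out as an ordered mapping from stages to elements, read left to right as the sequence of steps leading to outcomes:

```python
# Minimal sketch: a logic model as an ordered table of stages -> elements.
# All program content below is hypothetical and purely illustrative.
logic_model = {
    "Inputs":     ["trained counselors", "test kits", "clinic space"],
    "Activities": ["counseling sessions", "HIV testing"],
    "Outputs":    ["# clients counseled", "# clients tested"],
    "Outcomes":   ["clients adopt risk-reduction strategies"],
    "Impacts":    ["reduced HIV incidence in the target population"],
}

for stage, elements in logic_model.items():
    print(f"{stage:<10} -> {'; '.join(elements)}")
```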
