Decision-Making and Strategic Information Workshop on M&E of PHN Programs July 24-August 11, 2006 Addis Ababa
Learning Objectives By the end of the session, participants will be able to: • Identify the basic purpose of monitoring and evaluation (M&E) • Understand the components and uses of strategic information • Identify key elements in the decision-making process • List common barriers to using M&E data for decision-making and ways to overcome them
Monitoring versus Evaluation MONITORING = • Tracking changes in program performance over time EVALUATION = • Attributing program outcomes to their causes
Illustration of Program Monitoring (chart: a program indicator tracked over time, from program start to program end)
Illustration of Program Impact (chart: the change in program outcome with and without the program, from program start to program end; the gap between the two trajectories at program end is the program impact)
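To make the two charts above concrete, here is a minimal numeric sketch. All values are invented for illustration and are not drawn from the workshop materials: monitoring tracks the change in an indicator over time, while evaluation compares the observed end-of-program outcome to an estimated counterfactual without the program.

```python
# Illustrative sketch only; the numbers below are hypothetical.
outcome_at_start = 40.0          # indicator value at program start (e.g., % coverage)
outcome_with_program = 70.0      # observed indicator value at program end
outcome_without_program = 55.0   # estimated counterfactual value at program end

# Monitoring: change in the program indicator over time
change_over_time = outcome_with_program - outcome_at_start        # 30.0 points

# Evaluation: impact attributed to the program (observed minus counterfactual)
program_impact = outcome_with_program - outcome_without_program   # 15.0 points

print(f"Change over time: {change_over_time:.1f}")
print(f"Estimated program impact: {program_impact:.1f}")
```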
Purpose of Monitoring and Evaluation The purpose of monitoring and evaluation is to measure program effectiveness.
What Information Do Decision Makers Need M&E Data to Provide? • Process: • Was the program carried out as planned? • How well was it carried out? • Results: • Did the expected change occur? • How much change occurred? • Impact: • Is the change attributable to the program? • Does the change mean program “success”?
Data Versus Information The terms are often used interchangeably, but: • Data often refers to raw, unprocessed facts and figures • Information usually refers to processed data, or data presented in some sort of context
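A small sketch of the data-versus-information distinction above. The indicator, numbers, and variable names are hypothetical, chosen only to show raw data being processed and placed in context.

```python
# Hypothetical example: turning raw data into information.
raw_clinic_visits = 1250        # data: unprocessed count of visits this quarter
catchment_population = 50000    # data: estimated population served

# Information: the same data processed and presented in context
visits_per_1000 = raw_clinic_visits / catchment_population * 1000
print(f"Clinic visit rate this quarter: {visits_per_1000:.1f} per 1,000 population")
```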
Definition of Strategic Information Strategic information encompasses data and findings from: • Monitoring & evaluation (M&E) • Evidence-based research • Surveillance • Management information systems (MIS) • Routine health information systems (RHIS)
Uses of Strategic Information • Strategic Information is the foundation upon which all planning and program design decisions are based. • Strategic information facilitates program improvement, evaluates progress, and ensures policy compliance.
Strategic Information & Program Life Cycle
1. ASSESSMENT: What is the nature of the (health) problem?
2. STRATEGIC PLANNING: What primary objectives should my program pursue to address this problem?
3. DESIGN: What strategy, interventions, and approaches should my program use to achieve these priorities?
4. MONITORING: How do I know the activities are being implemented as designed? How much does implementation vary from site to site? How can the program become more efficient or effective?
5. EVALUATION: How do I know that the strategy is working? How do I judge if the intervention is making a difference?
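As a hedged illustration, the five-stage cycle above can be represented as a simple data structure; the stage names and guiding questions follow the slide, while the code itself is only a sketch.

```python
# Sketch: the program life cycle as an ordered mapping of stage -> guiding question.
program_life_cycle = {
    1: ("Assessment", "What is the nature of the (health) problem?"),
    2: ("Strategic planning", "What primary objectives should the program pursue?"),
    3: ("Design", "What strategy, interventions, and approaches should the program use?"),
    4: ("Monitoring", "Are activities being implemented as designed, and how does this vary by site?"),
    5: ("Evaluation", "Is the strategy working, and is the intervention making a difference?"),
}

for step, (stage, question) in program_life_cycle.items():
    print(f"{step}. {stage}: {question}")
```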
What is Decision-Making? • The process of choosing from among various alternatives using information
What Determines Utilization of SI for Decision-Making? (diagram: technical approach, systems approach, and individual behavior, set within the broader context of culture, society, and politics)
Understanding Decision-Making (diagram: decision makers and the decisions they make)
Understanding Decision-Making Decision makers can be characterized by: • Sector: health, education, commercial • Function: policy, planning, budgeting, service delivery, advocacy, evaluation • Level: national, regional, local
Types of Decisions • Policy-making • Work plan development • Resource allocation: budget, human resources, infrastructure
Understanding Decision-Making (diagram: decision makers and decisions, surrounded by the guiding questions Why? How? Who else? What information?)
Use of data and other information for decision-making occurs within a given context.
Who is Involved in Decision-Making? Different types of decision makers • View activities from different perspectives • Have different degrees of understanding of the program • Need different information • Want different information • Need or want information at different levels of complexity • Have different intensities of interest
How are Decisions Made? • Rational/Scientific • Political • Routine
When Is Decision-Making Not Political? Submitted answer to the AEA Presidential Address Challenge in 1989 by R. Turpin • No one cares about the program • No one knows about the program • No money is at stake • No power or authority is at stake • And no one in the program is making decisions about the program, or is otherwise involved in, knowledgeable about, or attached to the program
Class Activity: Data Use & Decision-Making • Why are decisions made in your organization? • Who makes decisions in your organization? • How are decisions made in your organization? • Discuss and present a time when data was used to make decisions... • … and a time when other factors outweighed the data in decision-making
Utilization-Focused Evaluation Doing evaluations that are useful and actually used • Evaluations are largely unused • As are research results • New directions in accountability • Increasing demand for professional evaluations • Need to bring use into practice
Standards for Evaluation • Utility - serve the practical information needs of intended users • Feasibility - be realistic, prudent, diplomatic, and frugal • Propriety - be conducted legally, ethically, and with due regard for those involved in and affected by the evaluation • Accuracy - reveal and convey technically accurate information
Class Discussion • What are the kinds of situations that pose special challenges to the utilization of M&E data for program decision-making? • What factors should be considered in order to foster the utilization of M&E findings by intended users?
Increasing Use of M&E Data: Ad-hoc Evaluations • Develop realistic recommendations for program improvement • Explore multiple uses of study data • Continuously remind decision makers of findings & recommendations • Share findings & recommendations with broad audiences • Assign evaluation staff to assist in implementing recommendations
Increasing Use of M&E Data: Outcome Monitoring • Provide timely reports • Involve program staff in the definition of outcome measures and data collection • Maintain high face validity of outcome data • Demonstrate use of outcome information • Repeat outcome measures on a regular basis • Mandate performance monitoring
“Seven Use-Deadly Sins of Evaluators” (per Patton, 1997:54) • Act as the primary decision-makers • Identify vague, passive audiences as users, instead of real people • Target organizations as users, instead of individuals • Focus on decisions, instead of decision-makers • Assume the evaluation’s funding agency is automatically the primary stakeholder • Wait until findings are in to identify intended users and intended uses • Distance themselves from people and politics
Closing Comments • Information must be based on quality data in order to be useful • Information must be communicated effectively in order to be useful • Information must be used in order for M&E to be worthwhile
References • Patton, Michael Quinn. 1997. Utilization-Focused Evaluation. Thousand Oaks: Sage Publications.