
RealWorld Evaluation




Presentation Transcript


  1. What’s new in … RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints, 2nd Edition. Session #822, presented by Jim Rugh & Michael Bamberger.

  2. Session Outline • Additional evaluation designs • A fresh look at non-experimental designs • Understanding the context • Broadening the focus of program theory • The benefits of mixed-method designs • Evaluating complicated and complex programs • Greater focus on responsible professional practice • Quality assurance and threats to validity • Organizing and managing evaluations • The road ahead (issues still to be addressed)

  3. What’s New in RealWorld Evaluation? Evaluation Designs

  4. Design #1.1: Longitudinal Experimental Design. Research subjects are randomly assigned either to the project group or to the control group. Project participants: P1 X P2 X P3 P4; Comparison group: C1 C2 C3 C4 (P = observation of project participants, C = observation of comparison group, X = intervention; observations at baseline, midterm, end-of-project evaluation, and post-project evaluation).
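A minimal sketch of the random-assignment step this design presupposes; the household IDs, even split, and seed are illustrative assumptions, not material from the session:

```python
import random

def randomize(subject_ids, seed=42):
    """Randomly split eligible subjects into a project group and a control group."""
    rng = random.Random(seed)        # fixed seed so the allocation can be reproduced
    shuffled = subject_ids[:]
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]   # (project group, control group)

# Hypothetical sampling frame of 200 households.
project, control = randomize([f"HH-{i:03d}" for i in range(1, 201)])
```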

  5. Design #1.2: Longitudinal Quasi-experimental. Project participants: P1 X P2 X P3 P4; Comparison group: C1 C2 C3 C4 (baseline, midterm, end-of-project evaluation, post-project evaluation).

  6. Design #2.1: Experimental (pre+post, with comparison). Research subjects are randomly assigned either to the project group or to the control group. Project participants: P1 X P2; Comparison group: C1 C2 (baseline, end-of-project evaluation).

  7. Design #2.2A: Quasi-experimental (pre+post, with comparison). Project participants: P1 X P2; Comparison group: C1 C2 (baseline, end-of-project evaluation).

  8. Design #2.2B: Quasi-experimental (retrospective baseline). Project participants: P1 X P2; Comparison group: C1 C2 (retrospective baseline, end-of-project evaluation).

  9. Design #3.1: Double Difference, starting at mid-term. Project participants: X P1 X P2; Comparison group: C1 C2 (midterm, end-of-project evaluation).
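The double-difference logic behind this design can be written out in a few lines of arithmetic; the outcome values below are invented purely for illustration:

```python
# Mean outcome scores at midterm (time 1) and end of project (time 2).
# P = project participants, C = comparison group; values are illustrative.
p1, p2 = 54.0, 68.0   # project group: midterm, end of project
c1, c2 = 53.0, 59.0   # comparison group: midterm, end of project

project_change = p2 - p1            # 14.0: change observed among participants
comparison_change = c2 - c1         #  6.0: change that happened anyway
double_difference = project_change - comparison_change   # 8.0: estimated project effect

print(double_difference)
```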

  10. Design #4.1A: Pre+post of project; post-only comparison. Project participants: P1 X P2; Comparison group: C (baseline, end-of-project evaluation).

  11. Design #4.1B: Post + retrospective baseline of project; post-only comparison. Project participants: P1 X P2; Comparison group: C (retrospective baseline, end-of-project evaluation).

  12. Design #5: Post-test only of project and comparison. Project participants: X P; Comparison group: C (end-of-project evaluation).

  13. Design #6: Pre+post of project; no comparison. Project participants: P1 X P2 (baseline, end-of-project evaluation).

  14. Design #7: Post-test only of project participants. Project participants: X P (end-of-project evaluation). Missing data need to be filled in through other means: • What change occurred during the life of the project? • What would have happened without the project (the counterfactual)? • How sustainable is that change likely to be?

  15. The 7 Basic RWE Design Frameworks

  16. What’s New in RealWorld Evaluation? A fresh look at non-experimental evaluation designs

  17. Non-Experimental Designs [NEDs] • NEDs are impact evaluation designs that do not include a matched comparison group • Outcomes and impacts are assessed without a conventional counterfactual that addresses the question: “What would have been the situation of the target population if the project had not taken place?”

  18. Situations in which an NED may be the best design option • Complex programs • Not possible to define a comparison group • When the project involves complex processes of behavioral change • outcomes not known in advance • Many outcomes are qualitative • Projects operate in different local settings • When it is important to study implementation • Project evolves slowly over a long period of time

  19. Some potentially strong NEDs • Interrupted time series • Single case evaluation designs • Longitudinal designs • Mixed method case study designs • Analysis of causality through program theory models • Concept mapping

  20. A. Interrupted time series. [Chart: monthly reports of alcohol-related driving accidents, before and after an anti-drinking law takes effect.]
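Interrupted time series data of this kind are commonly analyzed with segmented regression. The sketch below uses invented monthly accident counts with the law assumed to take effect at month 24; the data and variable names are assumptions for illustration, not from the presentation:

```python
import numpy as np
import statsmodels.api as sm

# Invented monthly counts of alcohol-related driving accidents over 48 months.
rng = np.random.default_rng(0)
months = np.arange(48)
law_in_effect = (months >= 24).astype(int)      # 1 once the anti-drinking law applies
accidents = 120 - 0.3 * months - 25 * law_in_effect + rng.normal(0, 5, 48)

# Segmented regression: pre-existing trend, level shift at the law,
# and change in trend after the law.
X = sm.add_constant(np.column_stack([
    months,                          # underlying time trend
    law_in_effect,                   # immediate level change at the interruption
    law_in_effect * (months - 24),   # change in slope after the interruption
]))
model = sm.OLS(accidents, X).fit()
print(model.params)   # [intercept, trend, level shift, slope change]
```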

  21. B. Single case designs

  22. Single case designs • The same subject or group may receive the treatment three times under carefully controlled conditions, or different groups may be treated each time. • The baseline and posttest are rated by a team of experts, usually based on observation. • If there is a significant change in each phase, the treatment is considered to have produced an effect.

  23. A Mixed-Method Case Study [Non-Experimental] Design. [Diagram components: program theory + theory of change; contextual analysis and process analysis of inputs, implementation, and outputs; a national household survey; defining a typology of individuals, households, groups, or communities; selecting a representative sample of cases (random or purposive) and ensuring the sample is large enough to generalize; qualitative data collection; preparation and analysis of case studies.]
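A rough sketch of the case-selection step in such a design (typology first, then a random or purposive sample of cases within each type); the community names, group sizes, and sampling fraction are invented for illustration:

```python
import pandas as pd

# Illustrative frame of communities classified by a simple typology.
frame = pd.DataFrame({
    "community": [f"C{i:02d}" for i in range(1, 41)],
    "typology": ["urban-poor"] * 15 + ["peri-urban"] * 10 + ["rural"] * 15,
})

# Random option: a proportional stratified sample within each type.
random_sample = frame.groupby("typology").sample(frac=0.2, random_state=1)

# Purposive option: hand-pick information-rich cases per type.
purposive_sample = frame[frame["community"].isin(["C03", "C18", "C31"])]

print(random_sample)
```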

  24. What’s New in RealWorld Evaluation? Understanding the Context

  25. The importance of context. [Diagram: project implementation, outcomes, impacts, and sustainability are shaped by the political, economic, and institutional contexts; by socio-cultural characteristics; and by security conditions (environmental, conflict, domestic violence).]

  26. What’s New in RealWorld Evaluation? Broadening the focus of program theory

  27. What’s New in RealWorld Evaluation? Benefits of Mixed-Method Designs

  28. What’s New in RealWorld Evaluation? Complex Evaluation Framework

  29. Simple projects, complicated programs and complex development interventions. Simple projects (small, simple): • “blueprint” producing a standardized product • relatively linear • limited number of services • time-bound • defined and often small target population • defined objectives. Complicated programs: • may include a number of projects and a wider scope • often involve several blueprint approaches • defined objectives, but often broader, less precise, and harder to measure • often not time-bound • context important • multiple donors and national agencies. Complex interventions (large, complex): • country-led planning and evaluation • non-linear • many components or services • often cover the whole country • multiple and broad objectives • may provide budget support with no clear definition of scope or services • multiple donors and agencies • context is critical.

  30. The Special Challenges of Assessing Outcomes for Complex Programs • Most conventional impact evaluation designs cannot be applied to evaluating complex programs • No clearly defined activities or objectives • General budget and technical support integrated into broader government programs • Multiple activities • Target populations not clearly defined • Time-lines may not be clearly defined

  31. Special challenges continued • Multiple actors • No baseline data • Difficult to define a conventional comparison group

  32. Alternative approaches for defining the counterfactual for complex interventions: • Theory-driven evaluation • Quantitative approaches • Qualitative approaches • Mixed-method designs • Rating scales • Integrated strategies for strengthening the evaluation designs

  33. Strategies for evaluating complex programs. [Diagram:] • Counterfactual designs for estimating impacts • Theory-based approaches: attribution analysis, contribution analysis (the value added of agency X), and substitution analysis (the net increase in resources for a program) • Mixed-method designs for strengthening alternative counterfactuals • “Unpacking” complex programs and portfolio analysis • Reconstructing baseline data through creative use of secondary data and triangulation • Qualitative approaches, quantitative approaches, and rating scales

  34. What’s New in RealWorld Evaluation? Greater Focus on Responsible Professional Practice

  35. What’s New in RealWorld Evaluation? Quality Assurance and Threats to Validity

  36. Quality assurance framework: threats-to-validity worksheets cover objectivity/credibility, internal validity, design validity, statistical validity, construct validity, and external validity across quantitative, qualitative, and mixed-method components.
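One possible way to keep such a worksheet in code is a small checklist object that records a rating and any identified threats per validity dimension; this structure is an illustrative assumption, not the book’s actual worksheet format:

```python
from dataclasses import dataclass, field

@dataclass
class ValidityWorksheet:
    """Illustrative threats-to-validity checklist for one evaluation strand."""
    strand: str                                   # "quantitative", "qualitative", or "mixed method"
    ratings: dict = field(default_factory=dict)   # dimension -> rating and notes

    def rate(self, dimension: str, rating: str, threat: str = ""):
        self.ratings[dimension] = {"rating": rating, "identified_threat": threat}

ws = ValidityWorksheet(strand="quantitative")
ws.rate("internal validity", "moderate", "no baseline for the comparison group")
ws.rate("statistical validity", "strong")
ws.rate("external validity", "weak", "single-district sample")
print(ws.ratings)
```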

  37. What’s New in RealWorld Evaluation? Organizing and Managing Evaluations

  38. Organizational and management issues • Planning and managing the evaluation • Preparing the evaluation • Recruiting the evaluators • Designing the evaluation • Implementing the evaluation • Reporting and disseminating the evaluation findings • Ensuring the implementation of the recommendations

  39. Organization and management [continued] • Building in quality assurance procedures • Designing “evaluation ready” programs • Evaluation capacity development • Institutionalizing impact evaluation systems at the country and sector levels

  40. What’s New in RealWorld Evaluation? The Road Ahead

  41. The RWE Perspective on the Methods Debate: Limitations of RCTs • Inflexibility. • Hard to adapt the sample to changing circumstances. • Hard to adapt to changing circumstances. • Problems with collecting sensitive information. • Mono-method bias. • Difficult to identify and interview hard-to-reach groups. • Lack of attention to the project implementation process. • Lack of attention to context. • Focus on one intervention. • Limitations of direct cause-effect attribution.

  42. A more comprehensive design vs. a simple RCT. [Diagram: a results chain running from interventions 2.2.1, 2.2.2, and 2.2.3 through outputs 2.1, 2.2, and 2.3 and outcomes 1, 2, and 3 to the desired impact and its consequences. A simple RCT covers only one intervention-to-output link, whereas a more comprehensive design covers the full chain.]

  43. To attempt to conduct an impact evaluation of a program using only one pre-determined tool is to suffer from myopia, which is unfortunate. On the other hand, to prescribe to donors and senior managers of major agencies that there is a single preferred design and method for conducting all impact evaluations can have, and has had, unfortunate consequences for all of those involved in the design, implementation and evaluation of international development programs.

  44. The RWE Perspective on the Methods Debate: Limitations of RCTs In any case, experimental designs, whatever their merits, can only be applied in a very small proportion of impact evaluations in the real world.

  45. What else do we address in the “Road Ahead” final chapter? • Mixed methods: the approach of choice for most RealWorld evaluations • Greater attention must be given to the management of evaluations • The challenge of institutionalization • The importance of competent professional and ethical practice • The importance of process • Creative approaches for the definition and use of counterfactuals • Strengthening quality assurance and threats-to-validity analysis • Defining minimum acceptable quality standards for conducting evaluations under constraints

  46. Bamberger, Rugh, and Mabry. RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints, 2nd Edition. Michael Bamberger, Jim Rugh, Linda Mabry. • This book addresses the challenges of conducting program evaluations in real-world contexts where evaluators and their clients face budget and time constraints and where critical data may be missing. The book is organized around a seven-step model developed by the authors, which has been tested and refined in workshops and in practice. Vignettes and case studies, representing evaluations from a variety of geographic regions and sectors, demonstrate adaptive possibilities for small projects with budgets of a few thousand dollars to large-scale, long-term evaluations of complex programs. The text incorporates quantitative, qualitative, and mixed-method designs, and this Second Edition reflects important developments in the field over the last five years. • New to the Second Edition: • Adds two new chapters on organizing and managing evaluations, including how to strengthen capacity and promote the institutionalization of evaluation systems • Includes a new chapter on the evaluation of complex development interventions, with a number of promising new approaches presented • Incorporates new material, including on ethical standards, debates over the “best” evaluation designs and how to assess their validity, and the importance of understanding settings • Expands the discussion of program theory, incorporating theory of change, contextual and process analysis, multi-level logic models, using competing theories, and trajectory analysis • Provides case studies of each of the 19 evaluation designs, showing how they have been applied in the field • “This book represents a significant achievement. The authors have succeeded in creating a book that can be used in a wide variety of locations and by a large community of evaluation practitioners.” (Michael D. Niles, Missouri Western State University) • “This book is exceptional and unique in the way that it combines foundational knowledge from social sciences with theory and methods that are specific to evaluation.” (Gary Miron, Western Michigan University) • “The book represents a very good and timely contribution worth having on an evaluator’s shelf, especially if you work in the international development arena.” (Thomaz Chianca, independent evaluation consultant, Rio de Janeiro, Brazil)
