
Program evaluation in Uruguay (2005–2008 and 2009–2010). A mixed-method approach

Presentation Transcript


  1. Program evaluation in Uruguay (2005–2008 and 2009–2010). A mixed-method approach. PERSPECTIVES ON IMPACT EVALUATION: Approaches to Assessing Development Effectiveness. CAIRO, March–April 2009. Ignacio Pardo, Universidad de la República, Uruguay

  2. Overview • 2005–2008 • Mixed methods • Impact evaluation 2009–2010

  3. “PANES”, social plan in Uruguay • 2002 crisis: poverty and indigence • Long-term unemployment • Very few social programs for the poorest and the unemployed in the past • Ministry of Social Development, created in 2005 • Evaluation: accountability, feedback

  4. PANES’ programs • Conditional Cash Transfer program: “Ingreso Ciudadano” • Adult literacy program • Educational interventions • Housing programs • … • Job training program: TRABAJO POR URUGUAY

  5. Mixed-methods evaluation • Increasing importance of mixed-methods research in many disciplines • Evaluation: early use of mixed methods, as in other “practical” fields (education, for example) • Recent systematization of mixed-methods research (Greene, 2007; Tashakkori and Teddlie, 2009)

  6. Evaluation design (+ monitoring): QUANTITATIVE + QUALITATIVE APPROACH (Mixed methods? We know why… But how? When?)

  7. Mixed methods for the “Trabajo por Uruguay” evaluation: “quan” objectives → survey → “quan” analysis; “qual” objectives → in-depth interviews + focus groups → “qual” analysis; both strands converge in a meta-analysis of the data
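To make the two-strand design above concrete, here is a minimal Python sketch of how the “quan” and “qual” strands might be brought together in the final meta-analysis step. All data, outcome names, and themes are hypothetical placeholders, not figures from the actual evaluation.

```python
# A minimal sketch of the "meta-analysis of data" step, assuming
# hypothetical survey scores (the "quan" strand) and coded interview /
# focus-group themes (the "qual" strand). Not the actual evaluation data.
from statistics import mean

# "Quan" strand: survey responses per outcome (hypothetical 0-10 scale)
survey = {
    "employability": [6, 7, 5, 8, 6],
    "self_esteem": [4, 5, 6, 5, 7],
}

# "Qual" strand: theme counts coded from in-depth interviews + focus groups
themes = {
    "employability": {"gained skills": 12, "no job offers yet": 9},
    "self_esteem": {"feels more confident": 14, "still discouraged": 3},
}

# Joint display: pair both strands per outcome so convergence or
# divergence between the methods can be inspected side by side.
for outcome in survey:
    print(f"{outcome}: mean survey score {mean(survey[outcome]):.1f}")
    for theme, n in themes[outcome].items():
        print(f"  interview theme {theme!r}: {n} mentions")
```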

  8. Purposes for mixed-methods evaluation (Greene, Caracelli & Graham) • Triangulation • Complementarity • Development • Initiation • Expansion

  9. Mixed-methods designs

  10. QUAN & QUAL in evaluation • Usual approach: • Community self-analysis, implementation, PROCESS EVALUATION: qualitative tools • Outputs, outcomes, IMPACT EVALUATION: quantitative tools

  11. “Uruguay Trabaja” impact evaluation: 2009–2010 • Plan level: PLAN DE EQUIDAD • Program level: URUGUAY TRABAJA • Similar to “Trabajo por Uruguay” • Evaluation design: • NEW DESIGN? YES • MIXED-METHODS APPROACH? YES • IMPACT EVALUATION? LUCKILY • QUALITATIVE IMPACT EVALUATION? PROBABLY

  12. Why not RCT for every impact evaluation design? “RCTs are not always best for determining causality and can be misleading. RCTs examine a limited number of isolated factors that are neither limited nor isolated in natural settings. The complex nature of causality and the multitude of actual influences on outcomes render RCTs less capable of discovering causality than designs sensitive to local culture and conditions and open to unanticipated causal factors.” (American Evaluation Association response to US Department of Education, 2003)

  13. Why not RCT for every impact evaluation design? “RCTs are not always best for determining causality and can be misleading. RCTs examine a limited number of isolated factors that are neither limited nor isolated in natural settings. The complex nature of causality and the multitude of actual influences on outcomes render RCTs sometimes less capable of discovering causality than designs sensitive to local culture and conditions and open to unanticipated causal factors.” (American Evaluation Association response to US Department of Education, 2003, and me)

  14. Can the qualitative approach play a role in impact evaluation? • An increasing number of experts answer yes and are trying to develop a more complete answer on how • RCT is not the ideal approach in every case; it is appropriate in certain cases (few variables, stable situation, context not affecting variables, clear expected outcome…) (Perrin, Tuesday) • The Lipsey–Scriven debate • Qualitative impact evaluation analysis techniques have been suggested, such as Qualitative Comparative Analysis (QCA) (O’Reilly, 2008), sketched below
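Since the slide points to QCA as one suggested technique, a minimal crisp-set QCA sketch may help: it builds a truth table from cases and flags whether each configuration of conditions consistently produces the outcome. The cases and condition names are hypothetical, chosen only to echo a job-training setting.

```python
# A minimal sketch of crisp-set QCA: group cases by their configuration of
# binary conditions and check whether each configuration consistently
# produces the outcome. All cases and condition names are hypothetical.
from itertools import groupby

CONDITIONS = ["training", "local_demand", "prior_work"]

# Each case: binary membership in each condition plus the outcome (employed).
cases = [
    {"training": 1, "local_demand": 1, "prior_work": 0, "employed": 1},
    {"training": 1, "local_demand": 1, "prior_work": 1, "employed": 1},
    {"training": 1, "local_demand": 0, "prior_work": 0, "employed": 0},
    {"training": 0, "local_demand": 1, "prior_work": 1, "employed": 0},
    {"training": 0, "local_demand": 1, "prior_work": 1, "employed": 1},
]

config = lambda case: tuple(case[c] for c in CONDITIONS)

# Truth table: a consistent row supports treating that configuration as
# (part of) a sufficient path to the outcome; a contradictory row sends
# the analyst back to the cases for closer qualitative inspection.
for cfg, group in groupby(sorted(cases, key=config), key=config):
    outcomes = [case["employed"] for case in group]
    label = "consistent" if len(set(outcomes)) == 1 else "contradictory"
    print(dict(zip(CONDITIONS, cfg)), "->", outcomes, label)
```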

  15. Some years ago • “In many instances, it is difficult to know in advance the various factors impinging on outcomes and in the absence of that knowledge, one cannot empirically isolate the role of the intervention relative to other (extraneous) factors. In this context, information on processes, institutions, and on perspectives of participants (and non-participants) themselves can be very helpful. These approaches are also well suited to explain in-depth the reasons for, and character of, critical incidents and events, and the dynamics or causality of such events when they are linked into sequences or processes” (Ezemenari et al., 1999) • Physical causality will often have a much more immediate and compelling “presence” than relations that are only of the “if not X” variety. (…) And it is eminently available to the qualitative research approach (that is) well suited to establish causality (Mohr, 1995)

  16. Interpreting and contextualizing impact evaluation • Interpretation of causality > causal logic • A synthesis of quantitative and qualitative methods will be the best approach to causality (Davidson, 2005) • “Unpacking the complexity of the change process” (Sridharan, yesterday): looking at precipitating causes, amplifying causes, and causes of vulnerability to change, and finding the best approach to each kind of causation; thinking of evidence as in a trial, looking for traces… • QUAL methods can answer the important question: What Makes What Works Work? (Scriven, 2008)

  17. Other types of causality? • Impact evaluation rests on counterfactual causality (sketched below). This is usually regarded as the most robust approach, for good reasons. But it is only one. • At the same time, we can take into account other types of causality: specifically, physical cause (modus operandi technique, forensic analysis, direct observation) • Reasons as causes (Berge, 2007) • “The qualitative method does not rely on any inference about the counterfactual: it relies on establishing with high probability a physical cause” (Berge, 2007) • Causal relations can be observed (realist framework) and depend on context, which allows qualitative explanation to enter attribution analysis (Maxwell, 2004)
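As an illustration of the counterfactual logic mentioned in the first bullet, here is a minimal sketch: the comparison group's mean outcome stands in for the unobservable “what would have happened without the program”. All figures are invented.

```python
# A minimal sketch of the counterfactual logic behind impact evaluation:
# the outcome of a comparison group proxies the unobservable state of the
# world without the program. All figures are hypothetical.
from statistics import mean

participants = [420, 510, 380, 450, 490]  # e.g. post-program monthly income
comparison = [400, 430, 360, 410, 440]    # similar non-participants

# Estimated impact = observed outcome minus the counterfactual proxy.
# The whole design question is how credibly the comparison group stands
# in for the counterfactual (randomization, matching, etc.).
impact = mean(participants) - mean(comparison)
print(f"Estimated average impact: {impact:+.0f} per month")
```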

  18. Qual impact evaluation: experimental logic + qual tools (Prowse, 2007 and Tuesday) • QUAN methods can tell us what and where, while QUAL methods are better tools to answer why and how • That is why the latter can perhaps be the main tool in impact evaluation, in a process of “creative experimentalism” (Prowse, Tuesday) • The life history method can be a powerful QUAL tool, generating QUAN and QUAL data, besides visual data on trajectories, and providing a specific approach to causality

  19. So… • In the next months, an evaluation design will be elaborated for “Uruguay Trabaja” • Mixed-methods evaluation may be taken one step further • Mixing methods at earlier steps; using the mixing for more purposes • The concept of impact evaluation can probably be expanded in order to give a better picture of the causal relations that will take place in the context of the program • Work in progress…

  20. Thank you! ipardo@fcs.edu.uy
