Randomized Impact Evaluations

Vandana Sharma, MD, MPH

Handicap International Meeting, Dec 4, 2013


Outline

  • The Abdul Latif Jameel Poverty Action Lab (J-PAL)

  • What is evaluation

  • Randomized evaluations

    • What is randomized evaluation

    • How to conduct a randomized evaluation

    • Challenges and difficulties

  • Example 1: Education and HIV in Kenya

  • Example 2: Voluntary Health workers in Nigeria


Abdul Latif Jameel Poverty Action Lab (J-PAL): Science into Action

  • J-PAL is

    • A center within MIT department of Economics, established in 2003

    • A network of researchers around the world

  • Dedicated to ensuring that the fight against poverty is based on scientific evidence

    • In particular our focus is on learning lessons from the randomized evaluations of anti-poverty projects (poverty broadly defined)

  • What do we do?

    • Conduct rigorous impact evaluations

    • Build capacity

    • Impact policy


J-PAL: A Network of Economists Running RCTs

91 academics, 441 evaluations in more than 55 countries worldwide


What is Evaluation?

  • A systematic method for collecting, analyzing and using information to answer questions about policies and programs

  • Process Evaluation:

    • Did the policy/program take place as planned?

  • Impact Evaluation:

    • Did the policy/program make a difference?


Evaluation is Crucial

  • Resources are limited

  • Little hard evidence on what is most effective

  • Many decisions are based on intuition or on what is in fashion

  • Rigorous evaluations allow accountability


Evaluation is Useful

  • Helps policymakers to better invest

  • Improve existing programs

  • Identify best practices


[Image: The Lancet, February 13, 2010]


Impact Evaluations

  • Impact evaluations measure program effectiveness by comparing outcomes of those (individuals, communities, schools, etc.) who received the program and those who did not

  • Objective is to measure the causal impact of a program or intervention on an outcome

    • Examples:

    • How much did free distribution of bednets decrease malaria incidence?

    • How much did an information campaign about HIV reduce risky sexual behavior?

    • Which of two supply chain models was more effective at eliminating drug shortages?


Impact Evaluations

  • In order to attribute cause and effect between the intervention and the outcome, we need to measure the counterfactual

    • What would have happened to beneficiaries in the absence of the program?


Impact: What is it?

[Diagram: the primary outcome over time under the intervention vs. the counterfactual; the gap between the two curves is the impact.]


Impact: What is it?

Impact =

  • The outcome some time after the program has been introduced

vs.

  • The outcome at that same point in time had the program not been introduced (the ”counterfactual”)
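A toy calculation may make this definition concrete (all numbers are hypothetical, not taken from any study):

```python
# Impact = the outcome with the program minus the counterfactual outcome
# (what would have happened, at the same point in time, without it).
# All numbers here are hypothetical.

episodes_with_program = 2      # malaria episodes observed under the program
episodes_counterfactual = 6    # episodes had the program not been introduced

impact = episodes_with_program - episodes_counterfactual
print(impact)  # -4: the program averted four episodes
```

The hard part, discussed next, is that the counterfactual term can never be observed directly.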


Impact Evaluations

  • BUT – We can’t observe the same individual with and without the program at the same point in time!

  • Since the counterfactual is not observable, the key goal of all impact evaluation methods is to construct or mimic it

  • Need an adequate comparison group

    • Individuals who, except for the fact that they were not beneficiaries of the program, are similar to those who received the program

  • Could be done by:

    • Before and After

    • Using the non-beneficiaries as the control group


Before and After

Before introduction of bednets: 6 malaria episodes in 6 months

After introduction of bednets: 2 malaria episodes in 6 months


Before and After

  • Was the bednet program effective at reducing malaria incidence?

  • Are there other factors which could have led to the observed reduction?

    • Seasonal changes

    • Rising incomes: households invest in other preventive measures

    • Other programs


Before and After

  • Important to monitor before-after

  • Insufficient to show impact of program

  • Too many factors changing over time

  • Counterfactual: What would have happened in the absence of the project, with everything else the same


Participants vs Non-Participants

  • Compare recipients of the program to

    • People who were not eligible for the program

    • People who chose not to enroll/participate in the program

  • Example: After bednet distribution, compare households with bednets vs those without

Impact of bednets?


Participants vs Non-Participants

  • What else could be going on?

  • People who choose to get the bednet may be different from those who do not

  • Observable Differences

    • Income

    • Education

  • Unobservable Differences

    • Risk factors

    • Other preventative measures


Participants vs Non-Participants

  • No way to know how much of difference is due to the bednets

[Chart: the observed difference is the sum of the impact of bednets and other factors.]


Participants vs Non-Participants

  • Non-beneficiaries may be very different from beneficiaries

    • Programs are often targeted to specific areas (e.g., poorer areas or areas that lack specific services)

    • Individuals are often screened for participation in program

    • The decision to participate is often voluntary

  • Thus non-beneficiaries are often not a good comparison group because of pre-existing differences (selection bias)

  • Selection bias disappears in the case of randomization


RANDOMIZED EXPERIMENTAL DESIGN IS THE GOLD STANDARD


Random Assignment

  • Identify a large enough group of individuals who can all benefit from a program

  • Randomly assign them to either:

    • Treatment Group: will benefit from the program

    • Control Group: not allowed to receive the program (during the evaluation period)

  • Random assignment implies that the distribution of both observable and unobservable characteristics in treatment and control groups is statistically identical.
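A minimal sketch of how such an assignment might be generated (illustrative Python; the roster and group size are invented):

```python
import random

# Simple random assignment: shuffle the eligible individuals, then split
# the list in half into treatment and control. Seeded for reproducibility.
random.seed(42)

individuals = [f"person_{i}" for i in range(100)]  # hypothetical roster
random.shuffle(individuals)

half = len(individuals) // 2
treatment, control = individuals[:half], individuals[half:]

assert len(treatment) == len(control) == 50
assert not set(treatment) & set(control)  # no one lands in both groups
```

Because the shuffle ignores every characteristic of the individuals, observable and unobservable traits end up balanced across the two groups in expectation.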


Random Assignment

    • Because members of the treatment and control groups do not differ systematically at the outset of the experiment, any differences that arise between them can be attributed to the program (treatment) rather than to other factors

    • If properly designed and conducted, randomized experiments provide the most credible method to estimate the impact of a program


Random Assignment

    • Randomization with only two units (two individuals or two groups) doesn't work: there can be big differences between treatment and control

    • But differences even out in a large sample: on average, the treatment and control groups contain the same mix of characteristics (the same number of "reds" and "blues")
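The "differences even out" claim can be checked with a quick simulation (illustrative only; the "red"/"blue" trait stands in for any characteristic):

```python
import random

# Compare covariate imbalance (difference in the share of "reds") between
# treatment and control, for small vs. large samples, averaged over many
# independent randomizations.
def imbalance(n, seed):
    rng = random.Random(seed)
    colors = ["red"] * (n // 2) + ["blue"] * (n // 2)
    rng.shuffle(colors)
    treat, ctrl = colors[: n // 2], colors[n // 2 :]

    def red_share(group):
        return sum(c == "red" for c in group) / len(group)

    return abs(red_share(treat) - red_share(ctrl))

small = sum(imbalance(10, s) for s in range(500)) / 500
large = sum(imbalance(1000, s) for s in range(500)) / 500
assert large < small  # imbalance shrinks as the sample grows
```

With 10 units the average imbalance is large; with 1,000 units it is close to zero, which is why randomized evaluations need adequately sized samples.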


    Can we randomize?

    • Randomization does not mean denying people the benefits of the project

    • Usually there are existing constraints within project implementation that allow randomization

    • Randomization is the fairest way to allocate treatment


    How to introduce Randomness

    • Organize lottery

    • Randomize order of phase-in of a program

    • Randomly encourage some more than others

    • Multiple treatments


Phase-in of a Program

    • Randomize the order in which clinics receive the program

    • Then compare Jan 2014 group to Jan 2015 group at the end of the first year

    Phase-in waves: Jan 2014, Jan 2015, July 2015
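A randomized phase-in can be generated as a lottery over start dates (a sketch; the clinic roster is invented, the wave dates come from the slide):

```python
import random

# Randomized phase-in: shuffle the clinics, then split them into waves
# with staggered start dates. During year one, the Jan 2014 wave is the
# treatment group and the not-yet-phased-in waves are the comparison group.
random.seed(1)

clinics = [f"clinic_{i}" for i in range(30)]  # hypothetical roster
random.shuffle(clinics)

waves = {
    "Jan 2014": clinics[:10],
    "Jan 2015": clinics[10:20],
    "July 2015": clinics[20:],
}

assert all(len(wave) == 10 for wave in waves.values())
```

Because every clinic eventually receives the program, this design sidesteps the objection that control units are denied the benefit.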


    If some groups must get the program

    [Diagram: highly vulnerable children → ENROLL; moderately vulnerable children → RANDOMIZE; children who are not vulnerable → not enrolled]

    • Example: a program for children in Kenya

      • Highly vulnerable children (orphans) must get the program; they are all enrolled

      • Randomize among the less vulnerable children


    Vary treatment intensity and nature

    • Intensity: randomize across communities to measure the additional impact of SMS reminders

      • HIV/AIDS information campaign (100 villages)

      • HIV/AIDS information campaign + SMS reminders (100 villages)

    • Nature: randomize across communities to ask which approach has a greater impact

      • HIV/AIDS information campaign via radio (100 villages)

      • HIV/AIDS information campaign via newspaper (100 villages)


    Unit of Randomization

    • At what level should I randomize?

      • Individual

      • Household

      • Clinic

      • Community

    • Considerations

      • Political feasibility of randomization at individual level

      • Spillovers within groups

      • Implementation capacity: One clinic administering different treatments


Unit of Randomization

    • Individual randomization: 630 participants (315 treatment, 315 control)

    • Clinic-level randomization: 150 clinics (75 treatment, 75 control), 3,000 participants

    • Bigger unit = bigger study
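As a sketch, clinic-level (cluster) randomization might look like this (clinic names and counts are hypothetical, loosely following the 150-clinic example above):

```python
import random

# Cluster randomization: whole clinics are assigned, so every participant
# in a clinic shares the same condition (avoiding within-clinic spillovers
# and the burden of one clinic administering two different treatments).
random.seed(7)

clinics = [f"clinic_{i}" for i in range(150)]
random.shuffle(clinics)
treatment_clinics = set(clinics[:75])  # the other 75 form the control group

def assignment(participants_clinic):
    """A participant's assignment is inherited from their clinic."""
    return "treatment" if participants_clinic in treatment_clinics else "control"

assert len(treatment_clinics) == 75
assert assignment(clinics[0]) == "treatment"  # clinics[0] is in the first 75
```

The trade-off is statistical: because outcomes within a clinic are correlated, a clustered design needs many more participants than an individually randomized one to achieve the same power.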


    Advantages of Randomized Evaluations

    • Results are transparent and easy to share

    • Difficult to manipulate or to dispute

    • More likely to be convincing


    Limitations

    • Cannot always be used (e.g., for political or ethical reasons)

    • Internal validity issues: power, attrition, compliance, etc.

    • External validity issues: sample size, generalizability of results to the population of interest

    • These issues often affect the validity of non-experimental studies as well


    EXAMPLE #1

    Evaluating school-based HIV education programs among youth in Kenya


    Education and HIV/AIDS in Kenya

    • Esther Duflo, Pascaline Dupas, Michael Kremer, Vandana Sharma


Background Information: HIV/AIDS in Kenya

    • Kenya AIDS Indicator Survey (KAIS)

      • August – December 2007

      • Sampled 18,000 individuals aged 15 to 64 yrs from 10,000 households across Kenya

    • Overall: 7.4% of Kenyans are HIV+ (8.7% of women, 5.6% of men)

    • More than 1.4 million Kenyans are living with HIV/AIDS

    National AIDS and STI Control Programme, Ministry of Health, Kenya. July 2008. Kenya AIDS Indicator Survey 2007: Preliminary Report. Nairobi, Kenya.


    HIV/AIDS in Kenya


    School-Based HIV Prevention Interventions

    • Education has been called a “social vaccine” for HIV/AIDS

    • Children aged 5-14 yrs have been called a “window of hope” because

      • they have low HIV infection rates

      • their sexual behaviors are not yet established and may be more easily molded

  • In Africa, most children now attend some primary school

  • School-based HIV prevention programs are inexpensive, easy to implement and replicate

  • There is limited rigorous evidence about the effectiveness of these types of programs


Background: Study Design

    • Between 2003-2006, non-profit organization ICS implemented HIV prevention programs in 328 primary schools in Western Kenya

    • Schools were randomly assigned to receive none, one or both of the following interventions:

      • Teacher Training in Kenya’s national HIV/AIDS education curriculum

        • National HIV curriculum focuses on abstinence until marriage and does not include condom information

        • Program provided in-service training to 3 upper-primary teachers to enhance delivery of the curriculum

      • Uniforms Distribution Program

        • Provided two free uniforms for one cohort of students (girls and boys), with the aim of helping them stay in school longer (second uniform provided 18 months after first)


Background: Study Design

    • Study Location: Butere, Mumias, Bungoma South and Bungoma East districts in Western Province

    • Study Sample: 19,300 youths (approx. half female) enrolled in Grade 6 in 2003 (~13 years old)

    • Experimental Design: schools randomly assigned to control, teacher training only, uniforms only, or both


    Results

    • Teacher Training:

      • Teachers were more likely to discuss HIV in class

      • Had little impact on knowledge, self-reported sexual activity, or condom use.

      • Increased tolerance toward people with HIV

      • No effect on pregnancy rates 3 years and 5 years later


    Results

    • Uniforms program:

      • Reduced dropout rates (by 17% in boys and 14% in girls)

      • Reduced the rate of teen childbearing:

        • from 14.4% to 10.6% after 3 years

        • from 30.7% to 26.1% after 5 years


    HIV/AIDS and Education in Western Kenya: A Biomarkers Follow-up Study

    • Objective: to study the impact of the teacher training and uniforms programs on actual transmission of STIs and HIV

    • Self-reported data is often unreliable, especially with respect to sexual behavior

    • Changes in knowledge or attitudes do not necessarily translate into sustained behavior change


    Study Design

    • A cross-sectional survey to measure HSV-2 prevalence and behavioral outcomes was administered to subjects between February 2009 and March 2011

      • Six to eight years after interventions

      • Note: not powered to estimate impacts on HIV


    [Diagram: 328 schools in Western Kenya randomly assigned to Teacher Training, Free Uniforms, or Control; programs offered in 2003; follow-up in 2009-2010 measured HSV-2 prevalence and KAP (Knowledge, Attitudes and Practices) in each arm]


Results I: HSV-2 Infection 7 Years Post-intervention


Results II: Marriage & Childbearing 7 Years Post-intervention

    • Uses sampling weights (those sampled during IT have higher weight)

    • Controlled for age at baseline, randomization strata (school location, sex ratio and performance at baseline), date of survey/blood draw


    Conclusion

    • Education subsidy (free uniforms) is effective at reducing teenage marriage and childbearing rates but not enough to reduce HSV-2 transmission

    • National HIV curriculum focused on abstinence until marriage seems ineffective in reducing HSV-2 transmission

    • Two programs implemented jointly appear to reduce HSV-2 transmission


    Thank you

    Vandana Sharma, MD, MPH

    Abdul Latif Jameel Poverty Action Lab (J-PAL)

    Massachusetts Institute of Technology

    [email protected]

    http://www.povertyactionlab.org/

