How to Evaluate in a Difficult Country Situation


Presentation Transcript


  1. How to Evaluate in a Difficult Country Situation: A Tale of Two Years. Tazeen Fasih, Human Development Economist

  2. Summary • Project Background • Introducing the Concept • Why Evaluate • Research Questions • Targeting • Unit of Randomization • Sample Size • Cost (mis)estimates • The unanticipated factor – Bureaucracy • The TTL and the evaluation team

  3. Project Background • BEDP: a $125 million multi-donor project • Objective: to assist the government in expanding the provision of quality basic education for all, with particular attention to gender equality • Objectives of the CCT: • Encourage retention and improve enrollment of girls in grades 4-9 • Provide an incentive for learning

  4. Gross Enrollment Rate by Grade, 2004/05 (chart)

  5. Introducing the Concept • The government understands the importance of monitoring and evaluation • But there was no concept of impact evaluation • How do you define “impact evaluation” in Arabic? • Audio conference with the deputy minister and team • Introductory workshops • Constant interaction

  6. Then Why Evaluate? • The government’s Basic Education Strategy includes the provision of subsidies to all basic education students • Existing direct subsidy/cash transfer schemes show evidence of leakages • Evaluate the impact of CCTs on girls’ enrollment in a new setting: a poor, tribal country where women’s status is low

  7. Research (policy) questions • Measure the impact of the CCT program on education outcomes, including learning; • Examine the important behavioral changes that may take place within a household as a result of the program; and • Identify how a CCT program should be structured in Yemen so that it is culturally appropriate and successful.

  8. Targeting - 1 • Program aimed at girls in grades 4-9 in schools in rural areas of Yemen • Geographic targeting would have been preferred but was not possible • Problems obtaining recent data • No targeting of households within a community based on poverty status • The objective was also to increase enrollment, not only retention, so randomization at the school level was not feasible

  9. Unit of Randomization: the school zone • Minimum school quality criteria: at least 50 students in the school, and grades offered through at least grade 8 • Randomization into treatment and control groups at the school-zone level (a sketch follows below)
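
A minimal Python sketch of the school-zone randomization described above. The zone records, field names, toy data, and seed are illustrative assumptions, not data or code from the actual evaluation.

import random

def assign_zones(zones, n_treatment, seed=0):
    """Filter eligible school zones, then randomize them into treatment and control."""
    # Eligibility per the slide: at least 50 students and grades offered through grade 8.
    eligible = [z for z in zones
                if z["students"] >= 50 and z["highest_grade"] >= 8]
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    rng.shuffle(eligible)
    return eligible[:n_treatment], eligible[n_treatment:]

# Toy data for illustration only.
zones = [{"zone_id": i, "students": 40 + i % 30, "highest_grade": 6 + i % 4}
         for i in range(150)]
treatment, control = assign_zones(zones, n_treatment=67)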

  10. Sample size issues – initial evaluation design • Three treatment groups: • Mother receives the cash transfer and the education bonus • Father receives the cash transfer and the education bonus • Mother receives the cash transfer and the child receives the education bonus • No age restriction • Comparing arms 1 and 2: in a tribal society it is not clear who the most effective recipient is • Poor status of women: giving money to women could lead to domestic violence • Comparing arms 1 and 3: if girls receive the education bonus, will this lead to higher achievement scores? (a sketch of the arm assignment follows below)
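
A minimal Python sketch of splitting eligible school zones across the three initial treatment arms plus a control group. The even split, the number of zones, and the seed are illustrative assumptions rather than figures from the presentation.

import random

# Arm labels follow the three treatment groups listed above, plus a control group.
ARMS = ["mother_cash_and_bonus",      # mother receives cash transfer and education bonus
        "father_cash_and_bonus",      # father receives cash transfer and education bonus
        "mother_cash_child_bonus",    # mother receives cash, child receives education bonus
        "control"]

def assign_arms(zone_ids, seed=0):
    """Shuffle school zones and split them evenly across the four study arms."""
    rng = random.Random(seed)
    shuffled = list(zone_ids)
    rng.shuffle(shuffled)
    return {arm: shuffled[i::len(ARMS)] for i, arm in enumerate(ARMS)}

assignment = assign_arms(range(120))  # e.g. 120 eligible school zones (illustrative)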

  11. Cost (mis)estimates • Estimating costs from previous experience • No experience with household surveys outside the main statistical office (CSO) • Limited capacity within the CSO • Other issues: • Costs estimated from international experience • Additional costs of “importing” firms • Remoteness costs not factored in

  12. Sample size issues – final evaluation design • Two treatment groups: • Father receives the transfer and the education bonus • Mother receives the transfer and the education bonus • Power calculations: 5,000 households; 100 school zones, of which 67 are treatment (see the sketch below)
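
A minimal Python sketch of the kind of power calculation behind these numbers, for a cluster-randomized design with 100 school zones (67 treatment) and roughly 5,000 households (about 50 per zone). The intra-cluster correlation, significance level, and power used here are assumed values, not figures reported in the presentation.

import math

def mde_clustered(n_clusters, share_treated, units_per_cluster, icc,
                  sigma=1.0, z_alpha=1.96, z_power=0.84):
    """Minimum detectable effect (in units of sigma) for a cluster-randomized trial;
    the default z-values correspond to 5% significance and 80% power."""
    cluster_term = math.sqrt(1.0 / (share_treated * (1 - share_treated) * n_clusters))
    design_effect = math.sqrt(icc + (1 - icc) / units_per_cluster)
    return (z_alpha + z_power) * cluster_term * design_effect * sigma

# Slide figures: 100 school zones, 67 of them treatment, ~5,000 households (~50 per zone).
# icc=0.10 is an assumed intra-cluster correlation, not reported in the slides.
print(round(mde_clustered(n_clusters=100, share_treated=0.67,
                          units_per_cluster=50, icc=0.10), 2))  # ~0.20 standard deviations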

  13. The unanticipated factor – Bureaucracy • Delays: • In hiring a firm • In Higher Tender Board approval • At the Project Administration Unit • Flexibility: • First governorate as an operational pilot • Extension of the pilot • Selection of the next governorate

  14. The TTL (Task Team Leader) and the evaluation team • Ideal evaluation design vs. budget constraints • Technical capacity and capacity building • Supporting the evaluation throughout • Sensitivity around program ownership

  15. But a happy beginning – at least • Anecdotal evidence and initial enrollment trends are very positive • The new Social Protection Strategy will learn from these experiences • A health project is seeking design advice/collaboration • Secondary Education CCT • New developments in financial transfer mechanisms – introduction of mobile ATMs
