
Impact evaluation of the Tanzania women’s virtual business incubator



Presentation Transcript


  1. Impact evaluation of the Tanzania women’s virtual business incubator Dakar, February 1-4, 2010 Elena Bardasi, World Bank, PREMGE; Alaka Holla, World Bank, AFTPM

  2. Outline • Project background • Objectives • Activities • Features • Impact evaluation • Learning objectives and rationale • Set up • Implementation issues

  3. Promotion of Women’s Entrepreneurship: Tanzania Virtual Business Incubator • Intervention to support women entrepreneurs through delivery of training and business development services (BDS) • Why entrepreneurship? • In Tanzania: 2.7 million enterprises producing 30% of GDP • 98% are microenterprises (<5 employees, <US$5,000) • The large majority of Tanzanians in the labor force are informal • Why women? • 80% of microentrepreneurs in Tanzania are women • Women face greater constraints than men, as well as gender-specific constraints → Choice of a VIRTUAL incubator (‘without walls’)

  4. Tanzania Virtual Business Incubator: Objectives • Support the growth of women-owned businesses through delivery of BDS to strengthen their skills (financial literacy, market outreach, PD&D, etc.) • Through supporting women’s entrepreneurship the project aims to: • Increase women’s social and economic empowerment • Improve women’s well-being as well as that of their households • Improve children’s outcomes • The target is the micro or very small entrepreneur who wants her business to grow

  5. Tanzania Virtual Business Incubator: Components • Development of a Virtual Business Incubator • Impact evaluation and monitoring framework • Impact evaluation is a ‘structural’ part of the project • Impact evaluation is very different from M&E. Both IE and a strong M&E framework are needed to evaluate the project and measure/understand its impact • Communication and dissemination

  6. Tanzania Virtual Business Incubator: Component #1: Activities • Pilot project in Dar es Salaam • Delivery of training and BDS to 500 women, with 750 ‘targeted’ (250 are in the control group) • AIDOS model • Virtual incubator (third-generation incubator, ‘without walls’) • Tailor-made portfolio of resources and support services • Attention to improving product quality and design • Market-oriented focus • Development of a network of mentors

  7. Tanzania Virtual Business Incubator: The training package

  8. Tanzania Virtual Business Incubator: Features of the project • BDS delivery is at the core of the project • ‘Hybrid’ project: PREM – PSD • Project born with its impact evaluation • Implemented by an NGO. Team includes: PRMGE; AFTPM; AIDOS (Italian NGO, technical advisor); Tanzania Gatsby Trust (local implementing partner working in consortium with IMED, SIDO-WED, Kwanza collection); ETC based in Dar es Salaam • It is a pilot, but sustainability and capacity building are central • Learning from the IE is useful for both research and implementation (scaling up, adjustments, new areas) • Cost: about US$1.2 million (of which about US$150,000 for the impact evaluation)

  9. Promotion of Women’s Entrepreneurship: Tanzania Virtual Business Incubator • Mpango wa kukuza ujasiriamali na biashara kwa wanawake (BIG): a program to grow entrepreneurship and women’s enterprises

  10. Learning objectives of the IE • To what extent does business training affect enterprise outcomes of female entrepreneurs in Dar es Salaam? • Main indicator: sales revenue • What kind of program works (doesn’t work)? • Business training • Business training + individualized attention from coaches/mentors • Does the training lead to any unintended consequences (good or bad)? • Debt • Depression • Domestic violence • Human capital investments in children • More female decision-making within the home

  11. Why can’t we just do regular monitoring? • Regardless of the change in revenue (positive or negative), we wouldn’t know whether the program helped anyone • Simple before-and-after comparison of beneficiaries • Gives program impact + whatever else happened between the start and end of the program • What if a lightning strike in the middle of the program burned down the beneficiaries’ main marketplace? • What if cheap imports flood the market? • What if a female MP starts promoting beneficiaries’ products? • Cannot disentangle these effects from program impact • Could underestimate or overestimate the true impact of the program
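
A tiny worked illustration of this point (made-up numbers, not project data): if a market-wide shock hits between the baseline and the endline, a before-and-after comparison attributes the shock to the program.

```python
# Illustrative numbers only: a before/after comparison confounds the program
# effect with anything else that happened over the same period.
true_effect = 0.10        # assume training raises sales by 10%
market_shock = -0.20      # e.g. cheap imports depress everyone's sales by 20%

baseline_sales = 1_000.0
endline_sales = baseline_sales * (1 + true_effect) * (1 + market_shock)

before_after_estimate = endline_sales / baseline_sales - 1
print(f"before/after change: {before_after_estimate:+.0%}")   # -12%, despite a +10% effect
```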

  12. Why can’t we just do regular monitoring? • Comparing people who received the program to people who did not • Gives program effect + whatever is different between participants and non-participants • Usually there is a reason people choose to join or not join a training program, and we usually can’t observe it ex ante • If very motivated people join the program → we will overestimate program impact • If people who won’t be successful without help join (e.g. those who just suffered an enterprise-related failure) → we will underestimate program impact • Impossible to disentangle these unobservable characteristics of participants from program impact
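
The selection problem described here can be illustrated with a small simulation. The sketch below is illustrative only, with made-up numbers: it assumes an unobserved “motivation” trait that drives both program take-up and sales growth, and is not based on the project’s data.

```python
# Illustrative simulation (not from the slides): selection bias in a naive
# participant vs. non-participant comparison, versus a randomized comparison.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
true_effect = 0.10                                     # assume training raises sales by 10%

motivation = rng.normal(0, 1, n)                       # unobserved trait
baseline_sales = np.exp(5 + 0.5 * rng.normal(0, 1, n))
growth = 0.05 * motivation + rng.normal(0, 0.05, n)    # motivation also drives growth

# Self-selection: more motivated women are more likely to join the program.
joins = rng.random(n) < 1 / (1 + np.exp(-motivation))
endline_naive = baseline_sales * (1 + growth + true_effect * joins)
naive_gap = endline_naive[joins].mean() / endline_naive[~joins].mean() - 1
print(f"naive participant vs. non-participant gap: {naive_gap:.1%}")  # overstates the 10% effect

# Random assignment breaks the link between motivation and treatment status.
treated = rng.random(n) < 0.5
endline_rct = baseline_sales * (1 + growth + true_effect * treated)
rct_gap = endline_rct[treated].mean() / endline_rct[~treated].mean() - 1
print(f"randomized treatment vs. comparison gap: {rct_gap:.1%}")      # close to 10%
```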

  13. Why can’t we just do regular monitoring? • Baseline + endline surveys • Can look at many outcomes of interest • Knowing about spillovers in the household • More accurate calculations of cost-benefit ratios • Can think about necessary complementary services for scale-up • Example: if depression or debt increases, can think about pairing the program with counseling or financial literacy training • Experimentation • Can learn about the effectiveness of 2 different program variants relative to the same baseline

  14. Basic set up • Local implementation partner • Advertises program [ongoing] • Screens applicants down to 750 entrepreneurs [March] • Baseline survey [April-May] • Random assignment of 750 women to 3 groups [May] • Treatment 1: Traditional business training [250] • Treatment 2: Treatment 1 + coaching & mentoring [250] • Comparison [250] • Program implementation [May] • Endline survey 1 year after program start [June-July 2011]

  15. Implementation issues • Program eligibility • Where does the number 750 come from? • Baseline survey • Treatment assignment • Simultaneous treatment • Date for endline survey

  16. Program eligibility • Might not be able to say anything about the entire population of female entrepreneurs in Dar es Salaam, but need to be clear about the population of interest • Ideally, entrepreneurs comparable to the targeted beneficiaries in a scaled-up version of the program • Target those with the highest potential to benefit → cannot say that we expect the same results during scale-up • Target those with very limited potential → no effect of the program in the impact evaluation → no rationale for scale-up • TZ VBI limited to: • Entrepreneurs established for at least 1 year • Certain sectors with growth potential identified in a market study • Entrepreneurs willing to pay an upfront commitment fee • Estimated program impact is most relevant for women who meet these criteria

  17. Why 750 women? • Answer: power calculations for sales + capacity of the local implementation partner • Power calculations from the TZ Enterprise Survey (2006): • 10% increase: 1,079 in each group → 2,158 total treated • 15% increase: 480 in each group → 960 total treated • 20% increase: 270 in each group → 540 total treated • 25% increase: 173 in each group → 346 total treated • 30% increase: 120 in each group → 240 total treated • With 250 women per group, an impact of a 15% increase in sales cannot be distinguished from zero impact with confidence • OK because this is an expensive intervention; not interested in knowing about small effects • Caveat: limited observations on female small business owners
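
The sample sizes on this slide appear consistent with the standard two-sample formula n per arm = 2·σ²·(z₁₋α/2 + z₁₋β)²/δ² at α = 0.05 and 80% power. The sketch below is a hypothetical reconstruction: the mean and standard deviation of sales are placeholders (a coefficient of variation of about 0.83 roughly reproduces the slide’s figures), not the actual moments from the 2006 Enterprise Survey.

```python
# Hypothetical power-calculation sketch (two-sided test, alpha = 0.05, power = 0.80).
# The sales mean and SD below are illustrative placeholders, not the actual
# moments from the 2006 Tanzania Enterprise Survey.
import math
from scipy.stats import norm

def n_per_arm(effect, mean_sales, sd_sales, alpha=0.05, power=0.80):
    """Sample size per arm to detect a proportional increase in mean sales."""
    delta = effect * mean_sales                       # absolute difference in means
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)     # z_{1-alpha/2} + z_{1-beta}
    return math.ceil(2 * (sd_sales * z / delta) ** 2)

mean_sales, sd_sales = 1_000.0, 830.0                 # placeholder: CV of about 0.83
for effect in (0.10, 0.15, 0.20, 0.25, 0.30):
    print(f"{effect:.0%} increase -> {n_per_arm(effect, mean_sales, sd_sales)} per arm")
```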

  18. What do we tell these 750 women? • Before narrowing women down to 750 • Need to say: “Not all of you can participate in the training program this year. We don’t have the capacity this year, but we can train you next year. For this year’s slots, there will be a lottery.” • Some women will say “Then forget it!” and drop out • Treatment assignment occurs on the remaining set of women • To keep all 750 women engaged: • All given an incubator “ID” card that has no meaning outside of the program

  19. Baseline survey • Occurring before assignment of treatment • Disappointment or enthusiasm about treatment status won’t affect responses or behaviors • But risk that women think responses could affect probability of treatment • Can use information in baseline (sector, neighborhood) to stratify treatment • Roughly 6 weeks before program start

  20. Baseline survey • Survey firm completely separate from local implementation agency • Pros • Survey firm has no incentive to find an impact • Local implementation agency does not have access to completely private information on trainees • Cons • Makes locating women more difficult • No in-house capacity for continuing the impact evaluation or starting new ones

  21. What can we learn from the baseline survey itself? • Descriptive data on characteristics we know little about (e.g. debt, inventories, suppliers) • Strata for treatment assignment • Can also experiment with data collection methods to see how these affect responses (especially to sensitive questions) • Type of non-monetary compensation for survey participation (personal versus business) • Order of instruments (household and business) • Very cheap or costless • Need to stratify on these “treatment arms” when assigning the real treatment

  22. Assignment of treatment • Need final list of eligible women from local implementation partner • Use computer program to randomly assign them to 3 groups, stratified on • Sector • Geography • Baseline experimental group • Send list with group assignment back to local implementation partner. • Local implementation partner sends out notices to women only telling them whether or not they have been accepted into the program (not specific treatment arm)
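
A minimal sketch of how such stratified assignment might be coded is below. It is illustrative only: the column names (sector, ward, survey_arm), the strata categories, and the fake applicant data are assumptions, not the actual variables or software used by the team.

```python
# Illustrative stratified random assignment of 750 eligible applicants to three arms.
# Column names and categories are hypothetical; real strata would come from the baseline.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2010)
ARMS = np.array(["training", "training_plus_mentoring", "comparison"])

def assign_treatment(df, strata=("sector", "ward", "survey_arm")):
    """Within each stratum, deal members as evenly as possible across the three arms,
    in random order."""
    assignment = pd.Series(index=df.index, dtype=object)
    for _, idx in df.groupby(list(strata)).groups.items():
        base = rng.permutation(ARMS)                                  # randomize which arm gets any remainder
        arms = np.tile(base, len(idx) // len(ARMS) + 1)[: len(idx)]   # balanced labels for this stratum
        assignment.loc[idx] = rng.permutation(arms)                   # shuffled within stratum
    return assignment

# Fake data standing in for the final list of 750 eligible women.
eligible = pd.DataFrame({
    "sector": rng.choice(["food", "textiles", "crafts"], 750),
    "ward": rng.choice(["Ilala", "Kinondoni", "Temeke"], 750),
    "survey_arm": rng.choice(["A", "B"], 750),      # baseline survey experimental group
})
eligible["arm"] = assign_treatment(eligible)
print(eligible["arm"].value_counts())               # roughly 250 women per arm
```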

  23. Treatment • Treatment will be roughly simultaneous • That is, no phase-in • Need to find many trainers, coaches, and mentors • Both treatment groups get the basic training first • Then the subset of 250 gets the more intensive variant afterwards

  24. Tanzania VBI Impact Evaluation (design flow): All applicants (>750) → Eligible applicants (750) → Baseline survey → Assign applicants to geographical and sectoral strata → Assign people to strata again → Random assignment: Treatment group (500), split into Basic treatment (250) and Basic treatment + mentoring (250), and Comparison group (250)

  25. Endline survey • Approximately 1 year after program start • Why? • Simple answer: Project cycle • Not clear ex ante how long it would take for effects to materialize • Ideally, a second endline 3+ years later

  26. What will we learn with this set-up? • Relative to the status quo, what is the impact of business training on enterprise and household outcomes of female entrepreneurs in certain sectors in Dar es Salaam? • The status quo includes other training programs; we cannot prevent the comparison group from getting their own training • Probably not powered enough to compare the treatment variants unless the impact of individualized services is really large • Do survey design features (like compensation or order of instruments) affect responses to sensitive questions (debt, depression, violence, transactional sex, etc.)?
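
At analysis time, the comparisons described here would typically be estimated by regressing the endline outcome on treatment-arm indicators with strata fixed effects. The sketch below is a hypothetical illustration on fake data; the variable names and the use of log sales as the outcome are assumptions, not the evaluation’s actual specification.

```python
# Hypothetical intention-to-treat analysis sketch on fake endline data.
# Variable names, strata, and outcome definition are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 750
fake = pd.DataFrame({
    "arm": rng.choice(["comparison", "training", "training_plus_mentoring"], n),
    "sector": rng.choice(["food", "textiles", "crafts"], n),
    "ward": rng.choice(["Ilala", "Kinondoni", "Temeke"], n),
})
# Fake outcome: log sales with a 10% effect for either treatment arm.
fake["log_sales"] = 5 + 0.10 * (fake["arm"] != "comparison") + rng.normal(0, 1, n)

# ITT regression: arm dummies (comparison omitted) plus strata fixed effects,
# with heteroskedasticity-robust standard errors.
model = smf.ols(
    "log_sales ~ C(arm, Treatment(reference='comparison')) + C(sector) + C(ward)",
    data=fake,
)
results = model.fit(cov_type="HC1")
print(results.summary())   # coefficients on the two arm dummies are the ITT estimates
```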

  27. Do we still need monitoring? • Impact evaluation does not fill all of a project’s monitoring needs • Most importantly, it typically does not track process • If mentoring does not work, is this because mentoring really doesn’t work or because mentors never met with the beneficiaries? • Did trainers show up in the classroom? Did beneficiaries show up? • The implementation agency and field coordinator need to systematically and quantitatively track this kind of information

  28. Broader monitoring framework • Includes variables difficult to collect in a baseline/endline survey (for example, qualitative variables) • e.g. increase in product quality, product differentiation, a woman’s assertiveness, etc. • Includes variables capturing process/implementation • e.g. number of women reached at each stage; number of visits by coaches; attendance at training sessions; quality of trainers, etc. • Includes variables assessing increased capacity of the local partner, team performance in dissemination, etc. • e.g. number of papers and policy notes; partnerships and collaborations stimulated by the project, etc. • Uses a variety of tools • e.g. coach logs, evaluation forms, focus groups, etc.
