
Two Examples of Dutch Policy Evaluation: randomization and regression discontinuity

This study examines the effectiveness of the Dutch innovation voucher program using a controlled experiment with random allocation, and a regression discontinuity design. The analysis focuses on the direct effect, the persistence effect, innovation, and science-industry interaction.



  1. Two Examples of Dutch Policy Evaluation: randomization and regression discontinuity
     Marc Van der Steeg
     CPB Netherlands Bureau for Economic Policy Analysis, The Hague, The Netherlands
     m.w.van.der.steeg@cpb.nl

  2. The evaluation problem
     • A two-way relation
       • causal: policy leads to more innovation
       • correlation: innovative firms make more use of innovation policy instruments
     • How to disentangle these two relations?
       • add covariates to the regression equation
       • do highbrow econometrics (e.g. matching)
       • or...
     • Controlled experiment
       • experimental group and control group
       • random allocation
       • the difference in outcomes is the causal impact
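The last bullet can be sketched as a simple difference in group means: under random allocation, comparing mean outcomes of the experimental and control groups estimates the causal impact. The numbers below are purely hypothetical, not from the Dutch program:

```python
# Minimal sketch: with random allocation, the difference in mean outcomes
# between the experimental group and the control group estimates the
# causal impact. All outcome data below are hypothetical.

treated = [1, 1, 0, 1, 1, 0, 1, 1]   # binary outcome (e.g. commissioned a project) for winners
control = [0, 1, 0, 0, 0, 0, 1, 0]   # same outcome for the control group

def mean(xs):
    return sum(xs) / len(xs)

causal_impact = mean(treated) - mean(control)
print(f"estimated causal impact: {causal_impact:.2f}")  # prints 0.50
```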

  3. The Dutch innovation voucher pilot 2004
     • Goal: “Lead them to water and pay them to drink” (Angrist et al., 2006)
     • Introduce SMEs to public research institutes
     • Market-oriented incentives for research institutes

  4. Characteristics of the voucher program
     • credit note, worth at most EUR 7,500
     • for SMEs only
     • no own contribution required
     • application-oriented research questions
     • placed with a defined group of institutes
     • no restrictions on the level of the question or the technology
     • valid for half a year
     • 100 vouchers available
     • LOTTERY if demand exceeds supply

  5. Example of a voucher project
     • “Biodiesel from Africa”
     • Seeds of the tropical plant Jatropha can be used to produce biodiesel
     • However, current oil presses are less efficient than those used for coleseed biodiesel: 40% is lost
     • Diligent Energy Systems used an innovation voucher and asked the Technical University of Eindhoven to improve the efficiency of the process
     • A PhD student carried out experiments
     • Plans for further research on adapting the coleseed press for Jatropha

  6. Research questions
     • To what extent did the voucher affect the degree of:
       • Science-industry interaction
         • Direct effect
         • Persistence effect
       • Innovation
         • Product innovations
         • Process innovations

  7. Application process
     • 1044 applications on September 17th, 2004
     • Lottery:
       • 100 winners
       • 944 ‘losers’
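The lottery step can be sketched in a few lines of Python; the counts (1044 applicants, 100 winners) come from the slide, but the applicant IDs and the seed are illustrative assumptions:

```python
# Minimal sketch of the voucher lottery: draw 100 winners at random,
# without replacement, from the 1044 applicants.
import random

applicants = list(range(1044))   # one ID per application (hypothetical IDs)
rng = random.Random(2004)        # fixed seed only for reproducibility; value is arbitrary
winners = set(rng.sample(applicants, 100))
losers = [a for a in applicants if a not in winners]

print(len(winners), len(losers))  # prints: 100 944
```

Because every applicant has the same chance of winning, winners and losers differ only by chance, which is what makes the later comparisons causal.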

  8. Data (1)
     • First round of telephone interviews in May 2005
       • 100 winners
       • 500 randomly selected losers
       • questions about actual and counterfactual behaviour
     • Response rate
       • 71 winners (71%)
       • 242 losers (48%)
     • Second round of telephone interviews in September 2006, with questions on:
       • interaction after the voucher period
       • realized innovations after the voucher period

  9. Data (2)
     • No significant differences between winners and losers in background characteristics:
       • firm size
       • region
       • sector
       • previous interaction with public research institutes
     • We can confidently attribute any differences in outcomes to the voucher policy instrument
       • however, we still add all available covariates to the regression equation
       • this increases the precision of the effect estimates

  10. Analysis (1): Direct effect
     • Data on behaviour during the voucher period:
       • 62 out of 71 (= 87%) winners commissioned a project
       • 20 out of 242 (= 8%) losers commissioned a project
     • Effect estimates
       • 13% of the vouchers not used (= (71-62)/71)
       • 8% crowding out (= 20/242)
       • 79% additional assignments (= 62/71 - 20/242)
     • Main conclusion: 8 out of every 10 vouchers were additional
     • The voucher generates substantially more science-industry interaction
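The effect arithmetic on this slide can be reproduced directly; all inputs are the counts reported above:

```python
# Reproduce the direct-effect estimates from the slide.
winners_total, winners_commissioned = 71, 62
losers_total, losers_commissioned = 242, 20

unused = (winners_total - winners_commissioned) / winners_total   # vouchers not used
crowding_out = losers_commissioned / losers_total                 # would have commissioned anyway
additional = winners_commissioned / winners_total - crowding_out  # additional assignments

print(f"not used: {unused:.0%}, crowding out: {crowding_out:.0%}, additional: {additional:.0%}")
# prints: not used: 13%, crowding out: 8%, additional: 79%
```

The loser share (8%) serves as the counterfactual commissioning rate, so subtracting it from the winner share isolates the assignments that exist only because of the voucher.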

  11. Stated preferences versus revealed preferences
     • Survey questions:
       • Losers: what would you have done if you had won a voucher?
       • Winners: what would you have done if you had not received a voucher?
     • 76% of winners say: without the voucher, fewer projects
     • 86% of losers say: with a voucher, more projects
     • Conclusion: in this case, stated counterfactual behaviour gives a good proxy for the real causal effect
     • But this need not always be the case!

  12. Other findings
     • No indication of an effect on project size
       • Most SMEs commissioned a project of roughly the voucher value
     • Indication of a small timing effect (about 1 in 10)
       • Voucher winners claimed that the voucher did not affect the number of projects, but only led them to carry out a project earlier
     • The same analysis on two 2005 voucher pilots shows the robustness of the results
       • Van der Steeg et al. (2007)
       • 2005-1: 7 out of 10
       • 2005-2: 5 out of 10

  13. Analysis (2): Persistence and innovation output effects
     • Persistence: no effect
       • Voucher winners did not carry out more assignments than voucher losers in the 1½ years after the voucher period
       • Reasons for not carrying out a new assignment: too expensive, no new questions, own research
     • Innovation output: mixed evidence
       • positive effect on process improvements
       • no significant effects on the realization of new or better products and new processes
       • indications of crowding out of own R&D

  14. Note on the importance of the method
     • Randomization was crucial
     • Applicants were not at all representative of Dutch SMEs!
       • average SME (11-50 workers): 14% realized a product innovation (CIS data)
       • voucher applicants (11-50 workers): 75%
     • Failing to control for these differences in innovation capacity would lead to a huge overestimation of the effects
     • Unobservable factors can be crucial; e.g. the timing of having an innovative idea may determine the decision to apply for an innovation subsidy

  15. Crucial issues for the design and evaluation of a voucher scheme
     • Identify measurable goals of the voucher policy
     • Randomization is crucial for evaluation!
     • Ensure sufficient vouchers and a large enough control group
     • Collect pre-treatment characteristics / behaviour of firms
       • via the application form; administrative datasets
     • Make cooperation with the evaluation obligatory
       • also for the control group of lottery losers!
     • Avoid abuse
       • e.g. printing an assignment at a university copy shop
     • Do not let lottery losers apply in the next wave
       • or leave a large enough time span between waves

  16. The Dutch voucher instrument after the 2004 pilot
     • Two new, larger pilots in 2005: 1000 vouchers
     • Definitive instrument since 2006:
       • around 6000 vouchers annually
       • split into small (EUR 2,500) and large (EUR 7,500) vouchers
     • 2010: introduction of “private” vouchers
       • possibility to commission a question to a private knowledge provider
     • 2011: the new government stopped the voucher program (as part of large budget cuts)

  17. Dutch cohesion policy (1): Regression discontinuity
     • In 2007, the government made plans to improve the 83 most disadvantaged neighborhoods
     • Substantial budget of EUR 350 million per year
     • Neighborhood action plans with goals to improve housing, jobs, education, integration, and safety

  18. Dutch cohesion policy (2): Setup and evaluation design
     • Ranking of neighborhoods on various socio-economic indicators: the worst 83 were selected
     • Evaluation design:
       • exploit the ranking with a cut-off at neighborhood 83
       • exploit the availability of before and after information on a range of relevant outcome indicators
       • diff-in-diff: compare the development in neighborhoods 1-83 with that in 84-183
       • ‘local’ diff-in-diff design: 50-83 versus 84-117
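The ‘local’ diff-in-diff around the cut-off can be sketched as follows; the ranking bounds come from the slide, but the index values are hypothetical placeholders, not results from the evaluation:

```python
# Minimal sketch of the 'local' diff-in-diff at the cut-off (rank 83):
# compare the before/after change in treated neighborhoods 50-83 with the
# change in control neighborhoods 84-117. All index values are hypothetical.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    # The control group's change absorbs the common trend; what remains
    # is attributed to the policy.
    return (treated_after - treated_before) - (control_after - control_before)

# hypothetical mean liveability-index values (scale 1-7)
effect = diff_in_diff(treated_before=3.8, treated_after=4.3,
                      control_before=4.0, control_after=4.2)
print(f"diff-in-diff estimate: {effect:.1f}")  # prints 0.3
```

Restricting the comparison to neighborhoods just either side of rank 83 makes the groups more similar, at the cost of a smaller sample.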

  19. Similar pre-trends support the common trend assumption: liveability index (scale 1-7)

  20. Common pre-trends also hold for a wide range of other characteristics (e.g. % of social housing, % of non-western immigrants)

  21. No effect on primary school test scores after one year, but later-year effects still need to be measured

  22. Other evaluation plans: innovation and science policy
     • Innovation loans
       • selection of proposals on the basis of certain criteria / judgements
       • compare the performance of just-selected versus just-not-selected companies
     • Scholarships for talented researchers: “Veni & Vidi grants”
       • selection of researchers on the basis of a ranking of proposals by a committee
       • compare the scientific output & careers of just-selected versus just-not-selected applicants

  23. Contact
     • Report on innovation vouchers: http://www.cpb.nl/eng/pub/cpbreeksen/discussie/58/
     • Contact: m.w.van.der.steeg@cpb.nl
