Impact Evaluation of Urban Upgrading Programs

Presentation Transcript


  1. Impact Evaluation of Urban Upgrading Programs. Judy Baker, FEU, November 19, 2007

  2. Impact Evaluation Program
  • Mainstreaming Impact Evaluations in relevant Bank Projects
  • Cluster of evaluations to evaluate the impact of urban upgrading projects
    • Afghanistan, Brazil, Colombia, Honduras, India, Indonesia, Iran, Jamaica, Nigeria, Tanzania
  • Part of the Development Impact Evaluation (DIME) Program; first SDN cluster
  • Project designs vary, so IE designs also vary
    • Usually include physical upgrading (water and sanitation, roads, housing improvements, street lighting, etc.)
    • Sometimes include social components (job training, social infrastructure, crime prevention, community development)
    • Sometimes include land tenure

  3. Main research questions
  • What are the impacts of upgrading projects?
    • Welfare (employment, income, and consumption)
    • Land and housing values
    • Health and education outcomes
    • Community (social capital)
    • Tenure outcomes where feasible
  • Do specific design features have variable impacts?
  • How cost-effective are the interventions?
  • Targeting outcomes?
  • Meta-analysis will be done in two parts: baseline and IE results

  4. Issues in designing the evaluations
  • Identifying appropriate and interested projects (TTLs, Government, resources)
  • Staffing, resources
  • Quasi-experimental designs; it is very difficult to randomize in infrastructure projects
  • Are there existing data sets?
  • Implementation of the baseline needs to be timely
  • Mobility in urban areas tends to be high

  5. Example: Bahia Integrated Urban Upgrading
  • PDO: to reduce urban poverty in a sustainable manner, targeting the poorest and most vulnerable sections of Salvador and strategic cities in Bahia with access to basic services and improved housing and social support services
  • Investment loan including:
    • Urban Infrastructure (65%)
    • Social Service Delivery (25%)
    • Institutional Strengthening (10%)
    • M&E (1.2%)

  6. Monitoring and Evaluation
  • Task manager initiated dialogue
  • Drew on pilot project
  • Government had strong commitment to evaluation; relationships already established
  • Counterpart team assigned to preparation of M&E design
  • Approach discussed at stakeholders meeting during project preparation

  7. M&E Design
  • Monitoring and Process Evaluation
    • Input from MIS, existing administrative data
    • Reporting carried out quarterly by implementing agency
    • Participatory monitoring based on continuous feedback from beneficiaries via Community Offices
  • Impact Evaluation
    • Rigorous design incorporating quantitative and qualitative methods, including propensity score matching (see the sketch below)
    • Baseline and follow-up surveys
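
Note: the slide names propensity score matching but not its mechanics. Purely as an illustration, the sketch below shows one common variant in Python: logistic propensity scores estimated from baseline covariates, followed by one-to-one nearest-neighbor matching of project households to control households. The column names, the covariate list, and the use of pandas/scikit-learn are assumptions made for the sketch, not part of the Bahia design.

```python
# Illustrative sketch only: one-to-one nearest-neighbor propensity score
# matching on a household-level baseline dataset. The treatment column
# name and covariates are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_households(df, covariates, treatment_col="in_project_area"):
    """Pair each project household with the control household whose
    estimated propensity score is closest (matching with replacement)."""
    X = df[covariates].to_numpy()
    t = df[treatment_col].to_numpy()

    # Step 1: estimate the propensity score (probability of living in a
    # project community) from baseline covariates.
    pscore = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    treated_ids = df.index[t == 1]
    control_ids = df.index[t == 0]

    # Step 2: nearest-neighbor match on the propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(pscore[t == 0].reshape(-1, 1))
    _, nn_idx = nn.kneighbors(pscore[t == 1].reshape(-1, 1))

    return pd.DataFrame({
        "treated_id": treated_ids,
        "control_id": control_ids[nn_idx.ravel()],
    })
```

Outcome comparisons between the matched project and control households (or their baseline-to-follow-up changes) would then form the basis of the impact estimates.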

  8. Impact Evaluation Design
  • Baseline carried out for all households in selected communities, also as an input to sub-project design
  • Baseline for sample households in control communities
  • Follow-up panel surveys in sample communities (project and control) one year, five years, and nine years after sub-project implementation (a sketch of the before/after comparison follows below)
  • Over-sampling of households in follow-up years to maintain sample sizes
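
Note: the presentation does not prescribe an estimator, but a design with baseline and follow-up rounds in both project and control communities naturally supports a before/after comparison across the two groups (a difference-in-differences style calculation). The sketch below is only illustrative; the long-format panel layout and column names are hypothetical.

```python
# Hedged sketch of a difference-in-differences calculation on the panel.
# Assumes a long-format DataFrame with hypothetical columns:
#   "household_id", "round" ("baseline" or "followup"),
#   "project" (1 = project community, 0 = control), and "consumption".
import pandas as pd

def diff_in_diff(panel: pd.DataFrame, outcome: str = "consumption") -> float:
    """Average change in the outcome for project households minus the
    average change for control households."""
    means = panel.groupby(["project", "round"])[outcome].mean()
    change_project = means.loc[(1, "followup")] - means.loc[(1, "baseline")]
    change_control = means.loc[(0, "followup")] - means.loc[(0, "baseline")]
    return change_project - change_control
```

With follow-ups at one, five, and nine years, the same calculation would be repeated for each round; in practice it would be run on the matched sample, with regression adjustment for any remaining differences.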

  9. Data resources
  • MIS: designed under the project; includes data on spending, material inputs, number of beneficiaries, and project status
  • Household survey: specially designed; includes a physical cadastre and socio-economic profile
  • Focus groups: in project communities
  • Land and housing valuation: for sample households
  • Epidemiological survey: rapid survey already in use
  • Secondary data: existing administrative data for monitoring (e.g., spending), school records, health records, mortality

  10. Evaluation Indicators
  • How has the project affected:
    • Basic infrastructure services (access, quality, use, and affordability)
    • Housing improvements (land and housing values, tenure, ownership structure, private investment in housing, microfinance)
    • Economic outcomes (income and consumption, employment status, local economic activity)
    • Welfare outcomes (poverty and inequality)
    • Health outcomes (access to health care, infant mortality, malnutrition, water-borne disease, illness)
    • Education (school attendance, achievement, literacy)

  11. Evaluation indicators, cont'd.
  • Security outcomes (crime and violence)
  • Migration (mobility in and out of slums)
  • Environmental outcomes (flooding, waste management, precarious housing)
  • Participation and social capital (level of association and communal activity)
  • Institutions (perceptions)
  • Cost/benefit analysis
  • Targeting analysis (a worked sketch of both analyses follows below)
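
Note: the slides list cost/benefit and targeting analysis without detail. Purely as an illustration of the underlying arithmetic, the sketch below computes a discounted benefit-cost ratio and a simple targeting share (the fraction of beneficiary households falling below the city's poorest consumption quintile cutoff). All figures, cutoffs, and names are hypothetical.

```python
# Illustrative arithmetic only; all numbers and names are hypothetical.
import numpy as np

def benefit_cost_ratio(benefits, costs, discount_rate=0.10):
    """Present value of benefits over present value of costs for two
    yearly streams of equal length (year 0 = project start)."""
    benefits = np.asarray(benefits, dtype=float)
    costs = np.asarray(costs, dtype=float)
    discount = 1.0 / (1.0 + discount_rate) ** np.arange(len(benefits))
    return (benefits * discount).sum() / (costs * discount).sum()

def targeting_share(beneficiary_consumption, city_consumption):
    """Share of beneficiary households whose per-capita consumption falls
    below the city's 20th percentile (poorest quintile)."""
    cutoff = np.percentile(city_consumption, 20)
    return float((np.asarray(beneficiary_consumption, dtype=float) <= cutoff).mean())

# Made-up example: a ratio above 1 and a share well above 0.20 would
# indicate positive net benefits and pro-poor targeting, respectively.
print(benefit_cost_ratio([0, 60, 70, 70], [120, 10, 10, 10]))
print(targeting_share([80, 95, 110, 150],
                      np.random.default_rng(0).normal(200, 60, 5000)))
```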

  12. Institutional arrangements
  • CONDER (a small unit) responsible for management and implementation of the M&E plan
  • Baseline and follow-up surveys contracted out to SEI, the state-run statistical agency
  • Assistance in the design and analysis of the impact evaluation to be contracted out, with input from the World Bank
  • Supervision included in BB and with funds from DIME

  13. Current Status
  • Major delays in negotiations and project implementation
  • Major change in Government
  • Baseline is now in the field

  14. Some lessons
  • The project will drive the evaluation, not the other way around!
  • Not appropriate for all projects, but there are some excellent opportunities to do IE
  • A cluster of thematic evaluations can really contribute to the knowledge base
  • Best opportunities arise with an interested TTL and Government counterpart
  • Involve an IE specialist from the start; continuity is very valuable
  • Staffing can be a challenge; draw on local expertise if available
  • Build supervision plans into the project
  • Develop a prototype survey instrument
  • Realistic expectations: measuring impacts requires time
