Evaluating Community Development Outcomes and Impacts Exploring Innovation Conference


Presentation Transcript


  1. Innovation Headaches: Evaluating Community Development Outcomes and Impacts
Exploring Innovation Conference, Session on Urban Information and Markets
May 3, 2007, St. Louis, MO
Nancy Pindus, The Urban Institute, www.urban.org

  2. Introduction
• Communities are complex systems.
• Interventions don’t occur in a vacuum.
• So, evaluating community development outcomes and impacts is not simple.
• Using examples, this presentation addresses the tough questions and offers some promising approaches.

  3. Program Evaluation Terms
• Logic Model: describes a program’s theory, operation, and performance
  Inputs → Activities → Outputs → Intermediate outcomes → End outcomes
• Process Evaluation: documents inputs, outputs, and implementation; assesses program activities
• Outcome Evaluation: measures the purported effects of completed activities: outcomes, effectiveness, costs and benefits
• Impact Evaluation: attempts to show that the observed outcome was caused by the program
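The logic model chain can be made concrete with a small sketch. This is purely illustrative; the presentation defines the terms but prescribes no implementation, and the example program and all field names below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Illustrative representation of a program logic model chain."""
    inputs: list[str] = field(default_factory=list)        # resources: staff, funding
    activities: list[str] = field(default_factory=list)    # what the program does
    outputs: list[str] = field(default_factory=list)       # direct products of activities
    intermediate_outcomes: list[str] = field(default_factory=list)
    end_outcomes: list[str] = field(default_factory=list)

# Hypothetical small-business lending program
model = LogicModel(
    inputs=["loan capital", "underwriting staff"],
    activities=["originate loans", "provide technical assistance"],
    outputs=["number of loans closed", "dollars disbursed"],
    intermediate_outcomes=["borrower firm survival at 2 years"],
    end_outcomes=["net job creation in target neighborhoods"],
)

for stage, items in vars(model).items():
    print(f"{stage}: {items}")
```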

  4. Sample Studies
• Small Business Administration (SBA) Program Evaluation
  • Assess agency progress in meeting objectives
  • Inform agency goal and target setting
  • Businesses are the focus of the evaluation
• New Markets Tax Credit (NMTC) Program Preliminary Evaluation Activities (CDFI Fund, Dept. of the Treasury)
  • Relatively new program with broad goals
  • Interest in community impacts
• Capital Access for Women: Profile of U.S. Best Practice Programs (Kauffman Foundation)
  • Interest in business outcomes and individual outcomes

  5. Why are you doing this?
• Federal performance reporting
• Required by funder
• To obtain more funds or expand the program (apply for grants; lobby the legislature)
• To improve the program
• To compare and select the best approaches or methods
There may be more than one reason; the main thing is to know the purpose of the evaluation.

  6. What do you want to know?
• What is your unit of analysis: household or firm vs. geographic area?
• What is your timeframe?
• Do you have clearly stated objectives?
• Do you have specific outcomes in mind? For example:
  • number of jobs created, quality of jobs, minority ownership of businesses
  • financing for building rehabilitation, new construction, reduced vacancy rate
  • increase in neighborhood services and businesses

  7. Relief is on the way....

  8. Sample Study: SBA Loan Programs
Purpose:
• To enable SBA to assess its progress in meeting agency objectives
Research questions:
• Does SBA assistance help the firms that receive it?
• To what extent does SBA assistance serve its market and produce some effect on it?
• Do SBA programs duplicate or overlap with other private- and public-sector programs?

  9. SBA Performance Analysis
• Approach: estimate the performance of assisted businesses over time, controlling for key business characteristics
• Go beyond previous studies of program effectiveness in four respects:
  • use better-quality data on firm outcomes (Dun & Bradstreet data)
  • larger sample sizes, allowing more precise estimates
  • more closely matched comparison groups
  • explore multivariate analyses that take into account levels and trends of outcome indicators before and after SBA assistance (see the sketch below)
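The presentation does not name a specific model. One common way to account for levels and trends before and after assistance, given a matched comparison group, is a difference-in-differences regression. Below is a minimal sketch on synthetic firm data; every variable name and parameter value is hypothetical, not drawn from the actual SBA study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000  # firms: assisted and matched comparison

# Synthetic firm-level panel: employment observed before and after assistance.
firms = pd.DataFrame({
    "firm_id": np.arange(n),
    "assisted": rng.integers(0, 2, n),   # 1 = received SBA assistance
    "baseline_emp": rng.poisson(10, n),  # pre-period employment level
})
long = firms.loc[firms.index.repeat(2)].reset_index(drop=True)
long["post"] = np.tile([0, 1], n)        # 0 = before, 1 = after

# Outcome: shared secular trend plus an extra bump for assisted firms post-assistance.
effect = 1.5  # "true" effect built into the simulation
long["employment"] = (
    long["baseline_emp"]
    + 0.5 * long["post"]                        # trend common to all firms
    + effect * long["assisted"] * long["post"]  # treatment effect
    + rng.normal(0, 1, len(long))
)

# Difference-in-differences: the interaction coefficient estimates the effect
# of assistance over and above the trend shared with comparison firms.
fit = smf.ols("employment ~ assisted * post", data=long).fit(
    cov_type="cluster", cov_kwds={"groups": long["firm_id"]}
)
print(fit.params["assisted:post"])  # should be close to 1.5
```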

  10. Evaluation of NMTC
• Program purpose: to attract capital to low-income communities by providing a credit against federal income taxes for qualified investments
• Evaluation purpose: articulate the benefits that the NMTC Program may bring to low-income communities
• Program evaluation in the aggregate, not evaluation of individual projects
• Requires multiple methods and data sources (qualitative and quantitative)

  11. NMTC: Challenges and Approaches
• Diversity of programs and outcomes: outcomes will vary by type of project
• Using data reported to the CDFI Fund and qualitative data collection, develop typologies of programs and logic models by project type/purpose
• Select key outcome indicators and apply them using the typology

  12. NMTC: Process Evaluation
• It’s a fairly new program that has not yet been evaluated
• Describe operating procedures and how they evolve over time
• Record all significant development activities over the period of study
• Ask respondents why they selected a project
• Ask respondents how the deal came together
• Use project data to quantify intermediate and short-term outcomes

  13. NMTC Community/Social Outcomes
• Use qualitative and quantitative data to address the role of NMTC, choices and trade-offs, project context, and projected community effects. Possible components:
  • financial analysis: compare to other financing
  • in-depth discussions with key stakeholders
  • site visits
  • independent review of portfolios of selected projects
  • quantify the range of costs and benefits to the extent possible (see the sketch below)
• But what are suitable comparisons?
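Where costs and benefits can be quantified, even a rough discounted range is informative. A minimal sketch of reporting a net-present-value range and benefit-cost ratio; all dollar figures, rates, and horizons are hypothetical, not drawn from any NMTC project.

```python
def npv(flows, rate):
    """Net present value of a list of annual cash flows (year 0 first)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical project: upfront public cost, then a range of annual community benefits.
cost = 5_000_000                               # year-0 subsidy cost (illustrative)
benefit_low, benefit_high = 600_000, 900_000   # annual benefit range (illustrative)
years, discount = 10, 0.05

for label, annual in [("low", benefit_low), ("high", benefit_high)]:
    flows = [-cost] + [annual] * years
    benefits_pv = npv([0] + [annual] * years, discount)
    print(f"{label}: NPV = {npv(flows, discount):,.0f}, "
          f"B/C ratio = {benefits_pv / cost:.2f}")
```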

  14. Access to Capital for Women
• Study of best-practice programs providing capital access to women
• At the program level, is the goal financial health/sustainability or mission-oriented outcomes?
• Should program outcomes measure individual self-sufficiency or economic development?

  15. Diversity of Programs Requires a Range of Evaluation Metrics
• Organization type
• Client population
• Types of businesses funded
• Program goals
• Client goals
• Services provided
Evaluate effectiveness and efficiency, not program self-sufficiency.

  16. Evaluation Challenges
• Data sources
• Measures
• Place-based vs. people-based strategies: unit of analysis
• Time period
• Cost
• Identifying a suitable comparison group
• Causality

  17. Causality??? The Real Headache
• Counterfactual = what would have occurred in the absence of the intervention
• Random assignment provides a reliable counterfactual, but is rarely feasible in community development programs, especially in evaluating broad interventions
• You can’t “prove” that a program works when there is no method to provide sufficiently rigorous estimates. What is the standard of proof?
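A toy simulation makes the counterfactual problem concrete. In this purely illustrative setup (not from the presentation), stronger firms self-select into a hypothetical program, so the naive treated-vs-untreated comparison overstates the true effect, while random assignment recovers it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Unobserved firm quality drives both participation and the outcome,
# so a naive comparison confounds selection with program impact.
strength = rng.normal(0, 1, n)
treated = rng.random(n) < 1 / (1 + np.exp(-strength))  # selection on strength
true_effect = 0.5
outcome = strength + true_effect * treated + rng.normal(0, 1, n)

naive = outcome[treated].mean() - outcome[~treated].mean()
print(f"true effect: {true_effect}")
print(f"naive difference: {naive:.2f}")  # overstates the effect

# With random assignment, the untreated group is a credible counterfactual.
random_t = rng.random(n) < 0.5
outcome_r = strength + true_effect * random_t + rng.normal(0, 1, n)
randomized = outcome_r[random_t].mean() - outcome_r[~random_t].mean()
print(f"randomized estimate: {randomized:.2f}")  # close to 0.5
```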

  18. What will you do with the information?
• The findings might surprise you.
• The findings might be inconclusive.
• Others may use or interpret the findings differently.
• Expectations about the magnitude of change may be unrealistic: most programs do not have sufficient resources to have a measurable impact on the problem.

  19. Summing Up
• Innovation is messy, and it’s very hard to evaluate.
• Understand the limitations at the outset.
• Educate your audience:
  • Don’t be fooled by flawed estimates offered as “proof.”
  • Do you really mean “impact” (i.e., implying causality)?
  • A lack of evidence does not mean that the program “doesn’t work”; you may need more information or more time.
• Use the information for program improvement.
