
Measuring Impact: lessons from the capacity building cluster (SWF Impact Summit, 2nd October 2013)


Presentation Transcript


  1. Measuring Impact: lessons from the capacity building cluster. SWF Impact Summit, 2nd October 2013. Leroy White, University of Bristol, Capacity Building Cluster

  2. Capacity-Building Cluster: the economic impact of the third sector. Funders. Academics. Partners include…

  3. Capacity Building Cluster For partners • Assess the impact and value of what the organisation does • Use data and analysis to improve service delivery For academics • Develop better insights into organisations • Use data and analysis for research questions For all • Build capacity in using and conducting research Through • KTPs, PhDs, Placements, vouchers

  4. Examples of projects • South West Forum: Demonstrating the economic impact of social purpose organisations through a range of small projects • Action for Children and National Youth Agency: Working with young people not in education, employment or training (NEET) on financial capability • Citizens Advice Bureaux: Impact of debt advice on outcomes • Crisis: Evaluation of the In Work Staying Better Off Programme • KWADS and SoberlinkTree: Drug and alcohol organisations working with young people dependent on drugs or alcohol • Festival of Nature: Impact of the festival on the local economy, using contingent valuation (a small illustrative sketch follows)
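The contingent valuation mentioned for the Festival of Nature project is, at its simplest, a matter of asking visitors what they would be willing to pay and scaling that up to estimated attendance. The sketch below illustrates only that arithmetic; the survey responses and visitor count are hypothetical, not figures from the actual study.

```python
# Hypothetical stated willingness-to-pay responses (GBP) from a visitor survey
wtp_responses = [0.0, 2.5, 5.0, 3.0, 10.0, 0.0, 4.0, 7.5]

mean_wtp = sum(wtp_responses) / len(wtp_responses)

estimated_attendance = 20_000  # hypothetical visitor count, not a real figure
aggregate_value = mean_wtp * estimated_attendance

print(f"Mean stated willingness to pay: £{mean_wtp:.2f}")
print(f"Estimated aggregate value to visitors: £{aggregate_value:,.0f}")
```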

  5. Why measure impact? External • To communicate to funders and commissioners the value of what the organisation does Internal • To understand better what the organisation achieves and what it does well

  6. Our Thinking • Growing appreciation of the links between issues relevant to policy makers and questions of a more general, social scientific nature • Consumers of policy analysis are demanding a stronger base of evidence for policy prescriptions • Funders want to know whether the valuable resources they invest are paying off • Social scientists are optimistic about the potential of (quasi-)experimental approaches to yield valuable new insights into the impact of third sector institutions

  7. Our thinking • Few rigorous impact evaluations have been conducted on third sector projects. Why? A lack of a comparison or control group, small sample sizes, the short-term nature of the interventions, etc. • The “evaluation problem”: the true counterfactual, what would have happened to beneficiaries had they not participated, cannot be observed • We suggest a quasi-experimental approach, i.e., treat the intervention itself as an experiment and find a naturally occurring comparison group to mimic the control group (see the sketch below)
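To make the comparison-group idea concrete, here is a minimal difference-in-differences sketch in Python. It assumes outcome measures are available for beneficiaries and for a naturally occurring comparison group both before and after the intervention; all figures are invented for illustration and are not drawn from any cluster project.

```python
# Mean outcome (e.g. a wellbeing or employment score) before and after
treated_before, treated_after = 42.0, 55.0        # beneficiaries
comparison_before, comparison_after = 40.0, 46.0  # naturally occurring comparison group

# A simple before/after change ignores the counterfactual entirely
naive_change = treated_after - treated_before

# Difference-in-differences: subtract the change the comparison group
# experienced anyway, standing in for the unobservable counterfactual
did_estimate = (treated_after - treated_before) - (comparison_after - comparison_before)

print(f"Naive before/after change: {naive_change:.1f}")
print(f"Difference-in-differences estimate of impact: {did_estimate:.1f}")
```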

  8. Our thinking • The choice of method depends mainly on practical concerns, including the characteristics of the project and the nature and quality of the data that are available. • There are techniques that generate comparison groups that are not randomly chosen, but are selected so that they closely resemble the beneficiary group, at least in observed characteristics. • In these designs, project participants are compared to non-participants using methods to account for the differences between the groups and to correct for the selection bias that might arise from non-random allocation of benefits (a matching sketch follows).
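One common way to account for differences between participants and non-participants is nearest-neighbour matching on an estimated propensity score. The sketch below uses synthetic data and scikit-learn's logistic regression purely to illustrate the idea; it is not the specific method used by the cluster.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic covariates for 500 people (e.g. age, prior need score)
n = 500
X = rng.normal(size=(n, 2))

# Participation is not random: it depends on the covariates (selection bias)
true_propensity = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
treated = rng.binomial(1, true_propensity)

# Outcome with a true participation effect of 2.0 plus covariate-driven differences
y = 1.5 * X[:, 0] + 0.5 * X[:, 1] + 2.0 * treated + rng.normal(size=n)

# A naive comparison of means mixes the true effect with covariate differences
naive = y[treated == 1].mean() - y[treated == 0].mean()

# Step 1: estimate propensity scores from observed characteristics
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each participant to the non-participant with the
# closest propensity score (nearest neighbour, with replacement)
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = c_idx[np.abs(ps[t_idx][:, None] - ps[c_idx][None, :]).argmin(axis=1)]

# Step 3: average the treated-minus-matched-control outcome differences
att = (y[t_idx] - y[matches]).mean()

print(f"Naive difference in means: {naive:.2f}")
print(f"Matched estimate of the effect on participants: {att:.2f}")
```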

  9. Our thinking A wide range of issues supports our choice of methods for evaluating third sector projects. • First, projects are not a homogeneous programme of interventions. There are considerable differences in the level, delivery and content of different frameworks. • Second, projects may vary in quality, often reflecting the nature and practices of the sector in which they work. • Finally, we recognise that many projects either have short timescales (often they are one-year interventions) or require feedback to policy makers or funders over the short term.

  10. Personal reflections • All methods and approaches are wrong • Be deeply pragmatic • Aim to describe the “conditions of possibility” for good outcomes to occur • Be aware of the political agenda for understanding impact • Co-produce with HEIs
