
Resilient Communities: Useful findings of innovative approaches


  1. Resilient Communities: Useful findings of innovative approaches [Place Based Evaluation Literature Review] Literature review carried out for UnLtd by Jen Dyer and Claire Bastin, October 2018

  2. Contents
  • Summary of literature review: main findings (slide 3)
  • Evaluating change: two useful examples of metrics and measures (slide 4)
  • Dissemination and communication: findings (slides 5-6)
  • Summaries of place based evaluation (slide 7)
  • Examples of place based evaluation (slides 8-11)
  • References (slides 12-13)

  3. Summary of Literature Review: Main findings
  • Limited information is available about place-based evaluation approaches, metrics, and good practice.
  • Academic and practitioner literature is ‘light on detail around metrics and methods used for evaluating PB approaches’.
  • Different organisations use a variety of metrics, with little standardisation; this makes it difficult to aggregate the impact of place-based work or to compare and contrast different approaches.
  • Evaluations focus on outputs rather than impact: output metrics don’t tell us what has changed or what difference a place-based intervention has made.
  • Evidence about the effectiveness of PB approaches is limited.
  • Evaluators don’t pay sufficient attention to disseminating and communicating generalisable findings or learning (including what doesn’t work).
  • Bastin and Dyer found that the literature presents place-based approaches as ‘complex and difficult to evaluate’.

  4. Evaluating change: two useful examples of metrics and measures
  • Example 1: MacLellan-Wright et al. (2007) developed a ‘community capacity building tool’ in which they co-defined nine domains of community capacity against which communities can map themselves at various stages of a project. The domains are: participation, leadership, community structures, asking why, resource mobilization, links with others, role of external supports, skills, knowledge & learning, and sense of community.
  • Example 2: Sridharan and Lopez (2004) advocate an ‘anticipated timeline of change’ to supplement the project Theory of Change. This allows outcomes to be mapped against time and can be useful in managing the expectations of different actor groups.
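The domain-mapping idea in Example 1 can be sketched in a few lines of code. This is a hypothetical illustration only, not part of the MacLellan-Wright et al. tool itself: the 1-5 scoring scale and the example scores are assumptions made for demonstration, while the nine domain names come from the slide above.

```python
# Hypothetical sketch: a community self-assesses against the nine
# MacLellan-Wright et al. (2007) capacity domains at two project stages,
# and per-domain change is computed. Scores (here 1-5) are illustrative.

DOMAINS = [
    "participation", "leadership", "community structures", "asking why",
    "resource mobilization", "links with others", "role of external supports",
    "skills, knowledge & learning", "sense of community",
]

def capacity_change(baseline, follow_up):
    """Return the per-domain difference between two self-assessments."""
    return {d: follow_up[d] - baseline[d] for d in DOMAINS}

# Illustrative scores only: every domain starts at 2, and two domains
# improve by the follow-up assessment.
baseline = {d: 2 for d in DOMAINS}
follow_up = dict(baseline, participation=4, leadership=3)

change = capacity_change(baseline, follow_up)
improved = [d for d, delta in change.items() if delta > 0]
```

Repeating the assessment at each project stage, as the tool suggests, would yield a time series per domain rather than a single difference.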

  5. Dissemination and communication: findings
  • Dissemination is often a neglected area in evaluation design and delivery (Burstein and Tolley, 2011). This is reinforced by the lack of examples of dissemination activity related to evaluation in the literature and practitioner reports.
  • Good practice recommendations from Burstein and Tolley (2011) include:
  • Ongoing dissemination in a variety of forms relevant for different groups
  • A mixture of in-person and remote dissemination
  • Accessible formats, language and locations
  • An emphasis on knowledge sharing amongst the different actors involved in the project
  • Involvement of external actors not involved in the project at various stages, in order to increase impact and invite in new ideas

  6. Dissemination and communication: findings
  • Generalisability of outcomes from PB work can be limited, but it can be increased through connection with, or integration into, wider-reaching schemes (Minkler, 2009).
  • Examples of this type include:
  • The NESTA Neighbourhood Challenge project (2010-13)
  • The Power to Change evaluation of the role of community businesses in providing health and wellbeing services (2017)

  7. Summaries of PB evaluation examples
  • The following slides show examples of PB project evaluations and highlight specific learning for UnLtd’s evaluation design.
  • It should be noted, however, that all of the project examples are case- and context-specific.
  • In addition, as discussed in the report, the general features of PB projects suggest a limited focus on metrics and measurable outcomes; this is illustrated in the examples detailed.

  8. References
  • Brunner, R., Craig, P. and Watson, N. 2017. Evaluability Assessment of Thriving Places: a Report for Glasgow Community Planning Partnership. Available at: http://whatworksscotland.ac.uk/category/topic/Evaluation-approaches/
  • Burstein, M. and Tolley, E. 2011. Exploring the Effectiveness of Place-based Program Evaluations. Policy Horizons Canada. Available at: http://p2pcanada.ca/files/2011/09/Place-based-Evaluations_Report_2011_FINAL.pdf
  • Cousins, J. B. and Earl, L. 1992. The case for participatory evaluation. Educational Evaluation and Policy Analysis, 14(4), pp. 397-418. Available at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.462.3344&rep=rep1&type=pdf
  • Cousins, J. B. and Whitmore, E. 1998. Framing Participatory Evaluation. New Approaches in Evaluation, (80), pp. 5-23.
  • Davies, R. 2015. Evaluability Assessment. BetterEvaluation. Available at: http://betterevaluation.org/themes/evaluability_assessment
  • Lankelly Chase. 2017. Historical review of place based evaluation. Available at: https://lankellychase.org.uk/wp-content/uploads/2017/10/Historical-review-of-place-based-approaches.pdf
  • MacLellan-Wright, M.F., Anderson, D., Barber, S., Smith, N., Cantin, B., Felix, R. and Raine, K. 2007. The development of measures of community capacity for community-based funding programs in Canada. Health Promotion International, 22(4), pp. 299-306.
  • Minkler, M. 2009. Linking Science and Policy Through Community-Based Participatory Research to Study and Address Health Disparities. American Journal of Public Health, 100(Suppl 1), pp. S81-S87.
  • Nichols, A. 2013. Evaluating place based programs. Available at: https://www.urban.org/urban-wire/evaluating-place-based-programs
  • Policy Horizons Canada. 2011. The evaluation of place based approaches: Questions for further research. Available at: http://www.horizons.gc.ca/en/content/evaluation-place-based-approaches
  • Royal Children’s Hospital. 2011. Place based approaches to supporting children and families. Policy brief, issue 23.
  • Sridharan, S. 2011. A Guide to evaluating place based initiatives. Government of Canada policy brief. Available at: http://www.horizons.gc.ca/en/content/top-10-questions-guide-evaluating-place-based-initiatives
  • Sridharan, S. and Lopez, E.I. 2004. Methodological lessons learned from the process evaluation of the comprehensive strategy for serious, violent, and chronic juvenile offenders. Social Policy Journal of New Zealand, pp. 128-147.
  • Stumbitz, B., Vickers, I., Lyon, F., Butler, J., Gregory, D. and Mansfield, C. 2017. The role of community businesses in providing health and wellbeing services: Challenges, opportunities and support needs. Available at: https://www.powertochange.org.uk/wp-content/uploads/2018/07/Health-Wellbeing-open-call-report.pdf

  9. References (continued)
  Websites:
  • Better Evaluation. Practitioner’s guides. Available at: https://www.betterevaluation.org/en and https://www.betterevaluation.org/en/plan/approach/participatory_evaluation
  • Participate. Ground Level Panels. Available at: http://participatesdgs.org/publications/
  • Power to Change. https://www.powertochange.org.uk/wp-content/uploads/2018/07/Health-Wellbeing-open-call-report.pdf
  • What Works Scotland. http://whatworksscotland.ac.uk/category/topic/Evaluation-approaches/
  Additional reading themes:
  • Capturing long-term outcomes and durable change (Foley, 2010; Koontz and Thomas, 2006; Levitan Reid, 2009)
  • Attribution of outcomes in open systems, as well as attribution of systems change (Bradford and Chouinard, 2010; Koontz and Thomas, 2006)
  • Attribution and accountability within collaborative governance, due to intertwined funding and decision making (Kubisch et al., 2010; Mayne, 1999)
  • Measuring capacity building, participation, relationships, and behaviour change (Whaley and Weaver, 2010)
  • Data gaps and logistical challenges in gathering data (Gardner et al., 2010; Koontz and Thomas, 2006)
  • Competing evaluation philosophies: “objective” external process versus “experience-based” community process (Bradford and Chouinard, 2010; Potvin and McQueen, 2008)
