Planning for Evaluation: An Introduction to Logic Models Public Health Institute, LJMU



  1. Planning for Evaluation: An Introduction to Logic Models Public Health Institute, LJMU 4th July 2019

  2. Learning outcomes for the event: • Understand evaluation and the role of logic models in the first step of the planning process. • Understand how to create a logic model. • Understand how to use a logic model to inform the evaluation planning process, including: • Confirming key stakeholders. • Identifying the scale and scope of the evaluation. • Identifying what data are already available/being collected. • Identifying what types of data to collect and when.

  3. The Evidence Ecosystem: primary research, real world evidence and big data feed systematic reviews, which in turn inform guidance, recommendations and professional standards. Examples: (1) The Digital and Trustworthy Evidence Ecosystem; (2) How to achieve more effective services: the evidence ecosystem; (3) Evidence for the Frontline.

  4. Actors in the Ecosystem. Evidence producers (primary research, real world evidence, big data): universities, government departments, research councils, the private sector. Evidence synthesisers: universities, government departments, NGOs/charities, the private sector, NICE/‘What Works’ centres. Evidence processors and disseminators: professional bodies/networks, policy organisations, NGOs, media, the private sector, local government. Evidence implementers: practitioners, professional bodies/networks, local commissioners.

  5. Evaluation in the evidence ecosystem. “Choosing an evidence-based intervention is the foundation, but there are additional necessary tools that adept agencies/organisations must wield to successfully construct an intervention program.” Dr Carolyn Webster-Stratton

  6. Commissioning Cycle: https://www.england.nhs.uk/participation/resources/commissioning-engagement-cycle/

  7. What is evaluation? It is conducted to define or judge current care. It explores current standards and measures a service without reference to another. It involves an intervention which is designed and delivered in accordance with guidance and professional standards. It uses existing data but may include new data. There is no allocation to an intervention and no randomisation.

  8. Why evaluate? Assess service users' and/or service providers' actual experience of a service. Assess how the objectives of the service or project are being met and any areas where they are not. Assess value for money. Assess whether a service is progressing according to plan. Identify opportunities for improvement. Document lessons to be learned for others and for the future. Establish a baseline of performance against which the impact of future initiatives can be compared.

  9. What are the questions you need to answer? • Are things going according to plan? Why or why not? • Are there things we could do to improve what we are doing? • Is what we are doing making any difference? How do you know? • Is the difference we are making worth the time/effort/money? Can you show this?

  10. Why think about this from the start? • Are things going according to plan? Why or why not? What is the plan? Are you collecting the right information from the beginning that will help you understand why it is working (or not)? • Are there things we could do to improve what we are doing? Are you setting up the necessary processes from the beginning to identify areas for improvement as you go along? • Is what we are doing making any difference? How do you know? Have you thought about what success will look like before you start? What information do you need to gather to demonstrate this when the time comes? • Is the difference we are making worth the time/effort/money? Can you show this? Will you keep a detailed record of the resource invested so that you can make this judgement further down the line?

  11. Evaluation needs to… be planned from the start; collect data, reflect and refine throughout the life cycle of the programme.

  12. Identify scope: what to include and what to exclude.

  13. Evaluation needs to… Have specific aims and objectives. Have a clear purpose and focus. Have a clear time-frame. Use stakeholder involvement (PPI). Ideally use mixed methods. Have clear reporting deadlines.

  14. Identify stakeholders (individual, significant others, wider stakeholders). Have clearly defined roles, responsibilities and resources. Consider different perspectives.

  15. Process and outcome evaluation should be carried out throughout to ensure ongoing programme development e.g. Are the right people attending? Is the targeting appropriate? Is the delivery right for your population?

  16. Logic models help with this… • Identify programme gaps in activity • Identify if the right data are being collected

  17. This process (building a logic model) helps identify programme gaps in activity and whether the right data are being collected to evidence outcomes. It can be desk-based or captured with stakeholders during a meeting. This is your theory of change, i.e. the delivery of these activities will achieve these outcomes in the short, medium and longer term. Evaluation will then test whether this happens and explore how and why.

  18. What are logic models? A convincing picture of what you are trying to achieve that shows the links between your intended activities, outputs and outcomes. A framework for integrating planning, delivery and evaluation. Not reality, but your best prediction of what needs to happen to get to your outcomes. Part of a wider planning and performance cycle.

  19. What does a logic model look like? A display of boxes and arrows, vertical or horizontal. Any shape is possible. The level of detail can be simple or complex.

  20. Input: the stuff that is done. Output: the results that are seen. Outcome: the impact you are looking for.

  21. Why use them? • Evidence-based storytelling (road map) • Communicate your (agreed) vision and plans • Provide clarity regarding activities and outcomes • Engage and sustain stakeholders (inspiration) • Aid planning and management • Focus and improve implementation • Help you know what resources are needed and when • Highlight assumptions and risks • Show similarities and differences to other programme activities • Link with the bigger picture

  22. Developing a logic model… Define the outcomes: the changes achieved as a result of the activities.

  23. Developing a logic model… Define the activities: what does the programme actually do?

  24. Developing a logic model… Define the outputs: the countable products.

  25. Input: the stuff that is done. Output: the results that are seen. Outcome: the impact you are looking for.

  26. Now, let's consider whether we are collecting the right data to evidence whether the outcomes are achieved. Use the arrows to connect the activities to outputs, and the outputs to outcomes.
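To make the mapping concrete, here is a minimal sketch (in Python, with hypothetical activity, output, outcome and data-source names that are not taken from the slides) of a logic model held as simple dictionaries, together with a check for outcomes that no activity chain reaches or that have no data source planned to evidence them.

```python
# Illustrative sketch only: names and links are hypothetical, not from the slides.
# A logic model as simple mappings: activities -> outputs -> outcomes,
# plus a record of which data sources will evidence each outcome.

activity_to_outputs = {
    "weekly parenting sessions": ["sessions delivered", "parents attending"],
    "staff training": ["practitioners trained"],
}

output_to_outcomes = {
    "parents attending": ["improved parenting confidence (short term)"],
    "practitioners trained": ["consistent programme delivery (medium term)"],
    "sessions delivered": [],
}

outcome_data_sources = {
    "improved parenting confidence (short term)": ["pre/post survey"],
    "consistent programme delivery (medium term)": [],  # no data planned yet
}

# Which outcomes are reachable from at least one activity via an output?
reachable = {
    outcome
    for outputs in activity_to_outputs.values()
    for output in outputs
    for outcome in output_to_outcomes.get(output, [])
}

# Gap check 1: outcomes named in the model but not linked to any activity.
unlinked = set(outcome_data_sources) - reachable
# Gap check 2: outcomes with no data source planned to evidence them.
unevidenced = {o for o, sources in outcome_data_sources.items() if not sources}

print("Outcomes not linked to any activity:", unlinked or "none")
print("Outcomes with no data collection planned:", unevidenced or "none")
```

The same three questions (which activity leads here, through which output, and which data source will evidence it) can just as well be walked through on paper for every outcome; the point is simply that the arrows double as a checklist.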

  27. Now consider the gaps… How to prioritise what to evaluate? RE-AIM (Glasgow, Vogt & Boles, 1999): Reach, Effectiveness, Adoption, Implementation, Maintenance.
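The RE-AIM framework itself does not prescribe a scoring method, but one simple way to use its dimensions for prioritisation is to rate each candidate evaluation question against all five and rank by total score. Below is a purely illustrative sketch; the candidate questions and scores are invented for the example.

```python
# Illustrative only: candidate questions and scores are invented for the example.
# Each candidate is scored 1 (low) to 5 (high) against the RE-AIM dimensions.

DIMENSIONS = ["Reach", "Effectiveness", "Adoption", "Implementation", "Maintenance"]

candidates = {
    "Are the right people attending?":      [5, 2, 3, 2, 1],
    "Is delivery consistent across sites?": [2, 3, 4, 5, 3],
    "Do gains persist at 12 months?":       [2, 5, 1, 2, 5],
}

# Rank candidate evaluation questions by their total score across dimensions.
ranked = sorted(candidates.items(), key=lambda item: sum(item[1]), reverse=True)

for question, scores in ranked:
    detail = ", ".join(f"{d}={s}" for d, s in zip(DIMENSIONS, scores))
    print(f"{sum(scores):>2}  {question}  ({detail})")
```

Such a ranking is only a starting point for discussion with stakeholders, not a substitute for it.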

  28. Key points: Develop a shared sense of purpose amongst key stakeholders; identify and acknowledge roles and responsibilities within the delivery of a programme. This includes commissioners but also others who would benefit from and/or be affected by the delivery of a programme. Who will collect, analyse and report on the data? Up to 10% of a programme budget should be set aside for evaluation.

  29. Next steps: Designing data collection tools How and when to collect different types of data How to analyse and interpret different types of data

  30. Types of process evaluation data to collect may include (qualitative: interviews, focus groups, surveys, monitoring data): • Service user: how did they find out about the service? Why did they attend? How easy was it to attend? What was their experience of the service? Were their needs met? Reach of the service. • Service provider: how easy was it to implement the service? • Non-service users: awareness of the service and barriers to use.

  31. Types of outcome evaluation data to collect may include (quantitative and qualitative: interviews, focus groups, surveys, monitoring data): • Service user: achievement of intended outcomes; unintended outcomes; impact on quality of life. • Service provider: intended and unintended outcomes. • Wider system-level outcomes: impact on partnerships and pathways; do other organisations benefit from the intervention? • Significant others: impact on quality of life.

  32. More information: Public Health Institute Faculty of Health, Education and Community Liverpool John Moores University 0151 2314382 H.Timpson@ljmu.ac.uk
