
Outputs, Outcomes and Impact – Let’s Call the Whole Thing Off?



  1. Outputs, Outcomes and Impact – Let’s Call the Whole Thing Off? Ruth Thomas – Derwen College • Claire Dorer – NASS

  2. Outline • To explore the concept of outcomes and how they relate to ISCs politically and in practice • To consider work by NASS on devising an outcomes framework for special schools • To consider the current Natspec QSR project and outcomes for learners with complex needs

  3. How do you know your school or college is good? • Ofsted / CSCI inspection • Internal QA • Preferred provider with Local Authorities (for schools) • We see the progress children and young people make with us

  4. But… • No national framework for outcomes • What providers value as outcomes is not always what LAs prioritise or what the LSC asks us to measure • Confusion and disagreement over what we should be measuring and how

  5. Why measure outcomes? • To show the effect a placement at your school/college has on a child or young person • To show potential placers that you provide a high-quality service – this will be an important new dimension for colleges from next year. But can you meet both functions with the same set of data?

  6. What do you want to know? • Output – a tangible result of a given action, e.g. a safeguarding policy or a new training programme for staff • Outcome – the result of that output, e.g. a decrease in adult protection referrals or 95% of all staff trained • Impact – the difference the service makes to the young people receiving it, e.g. young people report that they now feel more confident sharing issues with staff and now feel happier at college

  7. Who wants to know it? • Outputs are relatively easy to measure. For monitoring, it’s feasible for LAs/Ofsted to ask to see policies, minutes of meetings, etc. This is useful in setting minimum requirements for providers. Most schools and colleges currently receive this level of monitoring from the LAs/Ofsted that purchase places from them. • Outcomes are more difficult and tend to require a degree of evaluation. Providers may or may not value the information requested, and LAs/Ofsted currently appear to lack the capacity to monitor and to make comparisons between providers. • Impact is the most difficult to measure – it doesn’t lend itself simply to quantitative evaluation, BUT it provides the richest and most useful information about what an ongoing placement means for an individual young person.

  8. NASS and Outcomes • Work within a suite of national contracts – the contracts for children’s homes and fostering both have outcomes frameworks, and there is pressure for the schools contract to have one too • Recognise the need LAs have to make evidence-based decisions about placements • Want to develop a framework that moves beyond numbers and percentages to demonstrate the impact of placements on children and young people

  9. What we concluded • KPIs needed to be small in number and relevant to all settings • We need to work on cultural change with commissioners so that qualitative, person-focused outcomes are valued as highly as quantitative data • We want a focus on the consistency and rigour of the process, whilst leaving flexibility around the content

  10. NASS’s Outcomes Principles for Schools • Setting life goals should be the starting point – before discussions about placement begin • These goals should inform the placement • The school/college has responsibility for translating life goals into targets, stages and outcomes within a framework of ECM outcomes • Existing reviews and monitoring activities should focus on progress against outcomes • Schools and colleges can measure impact – have the goals been achieved? What might this young person’s life be like if goals are partially achieved or not achieved?

  11. Tools • Schools are about to get access to the SEN Progress guide, which helps measure the progress of young people with severe and complex LDD • NASS is producing a self-assessment toolkit to support schools with gathering and monitoring data • Hoping to link to external QA projects such as BILD’s Quality Network reviews

  12. Natspec and QSR • A national ISC PRD group (National Star, RNCB, Treloars, Henshaws & Derwen) last year began looking at measuring the success rates of learners with complex needs. This was to be based on the rigorous measurement of progress made towards long-term goals in any area – PSD, functional skills, vocational and employability skills, and ILS – enabling confident benchmarking across the sector and in GFE LLDD provision. • The data would be used to benchmark across the sector, including GFE LLDD provision; in addition, the data may be included in the ILR and FfE. • Latterly the group was joined by representatives from AoC, NASS & Ofsted.

  13. The Project • The challenge: to produce quantitative data on personal/individual success that allows comparison across providers. • The proposal: to produce annual data on the achievement of predicted Every Citizen Matters (ECM) outcomes for individual learners, identifying the numbers and percentages of learners who are ‘ahead/over’, ‘in line with/on’ or ‘behind/under’ the learning needed to meet their goals, amalgamated for the provider as a whole and against each ECM theme.
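  A minimal sketch, in Python, of the amalgamation the proposal describes, assuming each learner has already been judged ‘ahead’, ‘in line’ or ‘behind’ against each ECM theme. The record fields, the theme label and the summarise/breakdown names are hypothetical illustrations, not part of the pilot specification.

  from collections import Counter, defaultdict

  RATINGS = ("ahead", "in line", "behind")

  def summarise(records):
      # records: dicts such as
      # {"learner": "L001", "ecm_theme": "Be Healthy", "rating": "in line"}
      overall = Counter()              # provider as a whole
      by_theme = defaultdict(Counter)  # against each ECM theme
      for r in records:
          overall[r["rating"]] += 1
          by_theme[r["ecm_theme"]][r["rating"]] += 1

      def breakdown(counts):
          # numbers and percentages of learners at each rating
          total = sum(counts.values())
          return {rating: (counts[rating], round(100 * counts[rating] / total, 1))
                  for rating in RATINGS} if total else {}

      return breakdown(overall), {theme: breakdown(c) for theme, c in by_theme.items()}

  Run annually over a provider’s learner records, this yields both the headline breakdown and the per-ECM-theme breakdown described above.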

  14. The Benefits • Allows personalisation within a nationally recognised framework for consistency (RARPA plus ECM) • Measures success in outcomes which are valuable to learners and which are controlled by the provider • Supports self-assessment and evidences ‘distance travelled’/value added • Does not prescribe or constrain the curriculum offer, programme or provider type, and enables links to FLT • Links to local authority outcomes and Ofsted inspection • Measures success in outcomes which are valued by stakeholders and commissioners

  15. Next steps • Pilots to test the process and to establish guidance and criteria for levels of performance • Guidance on the process, including what might be included under each ECM outcome, including PIs • Guidance on how best to contextualise the data, including the use of evaluative criteria based upon the CIF • Parameters for small numbers of learners • Clarify definitions of complex needs and of the learners for whom this approach is appropriate • Validation and quality assurance (requires robust RARPA processes and self-assessment, with validation through peer review and external tests through Ofsted) • Establish links to the ILR

  16. Other recommendations • The use of destinations against predictions could be a useful indicator, but should not be used as a measure of success, as there are too many issues outside the control of the provider • Students who die, or whose health deteriorates such that continued attendance is impossible, should be removed from success rates and retention data (see the sketch below) • Your thoughts, ideas and suggestions would be welcomed
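  A minimal sketch of the exclusion rule above, again in Python. The "withdrawal_reason" field and the reason labels are hypothetical assumptions about how such withdrawals might be recorded, not a defined part of the recommendation.

  # Withdrawal reasons that take a learner out of scope for success/retention data
  EXCLUDED_REASONS = {"deceased", "health deterioration"}

  def in_scope(records):
      # Drop learners who should not count towards success rates or retention
      return [r for r in records
              if r.get("withdrawal_reason") not in EXCLUDED_REASONS]

  Applying in_scope(records) before any summarising keeps the excluded learners out of both the success-rate and the retention calculations.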

  17. Points to consider • Learners’ individual learning goals and ECM outcomes? • Are your RARPA processes robust? How do you achieve this? • The concept of ‘ahead/over target’, ‘in line with/on target’ and ‘behind/under target’ • Consider the use of percentages in data collection • What would you change or add to the pilot? • Any further suggestions, recommendations or comments?

  18. Thank you
