TRIANGLE Evaluation of Outcomes Pilot North East Floating Support Services

Presentation Transcript


  1. TRIANGLE Evaluation of Outcomes Pilot, North East Floating Support Services, 1st March 2010

  2. Key Stages

  3. Aims and Objectives: Reduce administration for SP and Triangle
  • 3% reduction in staff time spent on admin tasks relating to monitoring and maintenance of the contract
  • Reduction in paper-based transfer of information

  4. Evaluation - Administration
  • Significant administration requirement during the pilot stage
  • Uncertain whether there will be any significant long-term savings on administration
  • No savings for Co-ordinators; officers may possibly benefit
  • BUT the effort produces more effective management data
  • Regular contract management meetings have the potential to reduce time spent dealing with queries from SP
  • Cessation of monthly monitoring is welcome, but there are concerns that it may be re-introduced when SPOCC.net goes live

  5. Evaluation - IT
  • Some issues with electronic transfer of data – compatibility of e-mail systems
  • Computerisation of data lends itself to more effective performance reporting
  • Providers need to invest in appropriate IT systems to operate outcomes monitoring
  • SPOCC.net pilot not applicable to Floating Support or outcomes monitoring, so far
  • IT training required for all floating support staff
  • Some concerns about the extent to which a paperless approach can be deployed in the support planning process (client-led service)

  6. Aims and Objectives: Develop better links between service provision and the NIHE North East Homelessness Strategy
  • Produce a plan to determine the optimum service model
  • Expand the service to all districts and provide equitable access across them

  7. Evaluation – Service Model
  • Revised budget/remodelling plan to be approved for implementation from April 2010
  • Collection of data through the pilot has enabled more systematic identification of gaps in service provision
  • We have reviewed how we promote and raise awareness of the service with all districts and other referral agents
  • NIHE review of the homelessness strategy not yet complete – additional work is required to identify opportunities and resource implications, and more effective liaison with the area homelessness manager is essential

  8. Aims and Objectives: More efficient and effective contract management
  • Introduction of a risk-based, action-plan-driven contract management process
  • Group contracts to allow greater flexibility of resources and reduce waiting lists by 10%

  9. Evaluation – Performance Monitoring
  • Satisfied with the risk score and the elements which made up the risk assessment – see the illustrative sketch below
  • Reduced regulation was not realised during the pilot, but the proposed level of regulation should be proportionate
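The slides do not set out how the risk score is constructed. Purely as an illustration of the kind of weighted, risk-based scoring referred to above, the sketch below combines a few assumed risk elements into a single score; the elements, weights and 1–5 ratings are assumptions, not the SP methodology.

```python
# Illustrative only: the risk elements and weights below are assumptions,
# not the actual Supporting People risk assessment methodology.
RISK_WEIGHTS = {
    "financial_viability": 0.3,
    "service_performance": 0.3,
    "governance": 0.2,
    "safeguarding": 0.2,
}

def risk_score(ratings):
    """Combine per-element ratings (1 = low risk, 5 = high risk) into one weighted score."""
    return sum(RISK_WEIGHTS[element] * rating for element, rating in ratings.items())

print(risk_score({"financial_viability": 2, "service_performance": 1,
                  "governance": 3, "safeguarding": 1}))  # 1.7
```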

  10. Evaluation – Contract Management
  • 3 contract management meetings held
  • Format and structure of the meetings was appropriate
  • Overall experience was very positive
  • Two-way discussion and personal contact is the single biggest benefit
  • Discussions were constructive
  • Focused on outcomes, but a greater degree of pre-planning is required
  • Service improvement plans were developed and actioned

  11. Benefits - Contract Monitoring
  • We believe the process helped to improve communication between Triangle and Supporting People
  • It has ensured a greater focus on partnership working
  • Triangle was able to place its performance in context
  • Balanced scorecard worked well, as performance information was aligned to the issues SP needs to discuss as part of contract management
  • Enables a focus on the value of the service rather than its cost
  • There was a clear focus on improving the services

  12. Evaluation – Validation Visits
  • Validation visit to Ballymoney in September 2009
  • Some uncertainty over the structure required, due to a lack of clarity about how to complete QAF2
  • Formal report has not been received – clear timescales are needed for the production of reports
  • Areas for improvement were identified verbally on the day and all actions have been implemented; however, these have never been formally developed into a service improvement plan. This needs to happen to ensure effective contract management meetings
  • No issues with the regularity of visits as proposed under the new framework

  13. Evaluation – Grouping of Contracts
  • There should be benefits if only one validation visit is required for one aspect of the service
  • Strategic relevance documents should only need to be completed for the service under review (not all services in the group)
  • It would be beneficial if grouping were extended to services across SP areas
  • Some benefits were evident in terms of helping the Association to manage its staff resource
  • Higher levels of staff absence, but managed to ensure efficient service provision and effective management of waiting lists
  • Throughput levels similar

  14. Evaluation – QAF2
  • More streamlined and easier to complete, but issues with SP staff interpretation
  • There is a need for consistent and clear completion guidance for providers
  • There should be benefits if it enables more desktop analysis and less time on site – not evident during the pilot
  • Training provided by SP was useful
  • We believe training should be rolled out to non-management staff

  15. Aims and Objectives: Improve links between contract and outcomes
  • Establish a system to capture outcomes information at strategic and service level
  • Improve management information to demonstrate where service users are at a point in time
  • Use information to inform staff development

  16. Benefits of Outcomes Focus
  • It has helped to focus attention on ‘why’ we are providing the service, not ‘what’ we are doing
  • It has helped to inform the strategic focus and direction of the service by clearly identifying those areas that are most relevant to the clients we support
  • It has helped us to provide more focused support that is tailored to clients’ needs. This is evidenced by the reduction in the average length of support
  • It has helped to identify the key areas we need to focus on when planning staff training and development. Ultimately this should help us to target investment
  • It has been received very positively by staff and clients

  17. General Comments - Outcomes Monitoring
  • Outcomes monitoring needs to be considered as an evaluation tool rather than a performance tool
  • Focus should be on learning and improvement of services
  • Can be used to identify trends in referrals and changes that need to be made to the service
  • Should enable the provider and SP to track repeat returners and the reasons why
  • Our experience has been that the average length of support has reduced from 9 to 6 months
  • Evidence of provision of more focused support, but are we spending enough time on softer outcomes?
  • Is this good or bad? There is clearly a need to track returner rates in the long term – see the illustrative sketch below
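As a minimal sketch of how the two measures mentioned above (average length of support and repeat returners) could be tracked from referral records, the example below assumes a simple list of start and end dates per client; the data and field names are illustrative, not the pilot's.

```python
from collections import Counter
from datetime import date

# Illustrative referral records (made-up data), one per episode of support.
referrals = [
    {"client": "C001", "start": date(2009, 1, 5), "end": date(2009, 7, 10)},
    {"client": "C002", "start": date(2009, 2, 1), "end": date(2009, 6, 20)},
    {"client": "C001", "start": date(2009, 9, 1), "end": date(2009, 12, 15)},  # repeat returner
]

# Average length of support, in months (approximating a month as 30.4 days)
months = [(r["end"] - r["start"]).days / 30.4 for r in referrals]
print(f"average length of support: {sum(months) / len(months):.1f} months")

# Clients with more than one episode of support = repeat returners
counts = Counter(r["client"] for r in referrals)
print("repeat returners:", [c for c, n in counts.items() if n > 1])
```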

  18. Triangle’s Outcomes Indicators

  19. Evaluation of Indicators Selected
  • 10 indicators were chosen to reflect the generic service
  • The evidence shows that 3 are clearly relevant to the strategic focus of the service
  • The other 7 are clearly less relevant (some more or less so than others)
  • They are still needed, as the services are generic, but they should be viewed as supplementary indicators (there should be a hierarchy of indicators)
  • Some indicators need to be changed
  • The maintaining tenancy indicator doesn’t reflect the advice and support given to those looking for housing and does not fully reflect the effectiveness of the support provided

  20. Learning from Analysis of Data
  • Simple analysis of the % of clients achieving the maximum score does not fully reflect the impact of the service
  • An analysis of all the scores achieved (see charts) demonstrates a range of positive outcomes for clients – an illustrative calculation follows below
  • The outcomes achieved need to be viewed in the context of the objectives agreed with clients, e.g. clients may only want an outcome that equates to a 3; a client-focused service should therefore aim to deliver this
  • There must be a fundamental link between the data captured through outcomes monitoring and client satisfaction
  • 100% of clients surveyed at service exit said they were better able to manage their lives
  • Therefore, the support provided was effective and the outcomes achieved were relevant and appropriate
  • But this would not be immediately evident by just looking at the % who achieved a score of 5 at service exit
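A minimal sketch of the point above: reporting the full distribution of exit scores alongside the % achieving the maximum score. The 1–5 scale follows the slides; the scores themselves are made-up examples, not the pilot data.

```python
from collections import Counter

# Hypothetical exit scores on the 1-5 outcomes scale; not the pilot's actual data.
exit_scores = [5, 4, 5, 3, 5, 4, 2, 5, 4, 5]

distribution = Counter(exit_scores)
pct_at_max = 100 * distribution[5] / len(exit_scores)

print(f"% achieving a score of 5 at exit: {pct_at_max:.0f}%")
for score in range(1, 6):
    print(f"score {score}: {distribution[score]} clients")
```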

  21. Learning from Analysis of Data
  • What is a reasonable target level for the % of clients achieving a score of 5?
  • Would setting such a target be counter-productive?
  • As above, outcomes are only one part of the picture and should not be used in isolation to judge service performance
  • It is vitally important that the impact of client non-engagement and external factors (especially the availability of specialist external services) is factored into the analysis
  • Evidence from this pilot shows that where full outcomes were not achieved, this was outside of the Association’s control; in terms of service effectiveness, therefore, the % supported to achieve the full outcome was 100% in almost all cases

  22. Learning from Analysis of Data
  • Need to factor client needs into the analysis
  • Mental health, drug/alcohol use and complex needs = cyclical behaviour
  • There was clear evidence that outcomes varied according to demographics, e.g. a higher level of non-engagement in Ballymena
  • How can the impact of crisis intervention be measured?
  • We have found that a significant level of work was required at the early stages, with little evidence of long-term work being needed
  • Additional analysis may be useful to identify issues/trends – a sketch of these breakdowns follows below
  • Outcomes by primary client need
  • Outcomes by source of referral
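A small sketch of the additional breakdowns suggested above (outcomes by primary client need; the same grouping works for source of referral). The category names and records are illustrative assumptions.

```python
from collections import defaultdict

# Illustrative records: primary client need, referral source and exit score are
# assumed categories, not Triangle's actual data.
records = [
    {"need": "mental health", "referrer": "NIHE", "exit_score": 4},
    {"need": "mental health", "referrer": "self", "exit_score": 5},
    {"need": "addiction",     "referrer": "NIHE", "exit_score": 3},
]

by_need = defaultdict(list)
for r in records:
    by_need[r["need"]].append(r["exit_score"])

for need, scores in by_need.items():
    print(f"{need}: average exit score {sum(scores) / len(scores):.1f} (n={len(scores)})")
```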

  23. Evaluation - Distance Travelled
  • Monitoring of distance travelled is not really applicable, given the length of support (an average of less than 1.5 quarters in the service)
  • However, it could be used to demonstrate the progress made in supporting those who did not achieve a score of 5 – see the illustrative sketch below
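A minimal sketch of a distance-travelled measure as described above: the change between entry and exit scores on the same 1–5 indicator, which shows progress even where a client did not reach a score of 5. The client records are hypothetical.

```python
# Hypothetical entry/exit scores on the same 1-5 indicator; "distance travelled"
# here is simply the change between the two.
clients = [
    {"client": "C001", "entry": 2, "exit": 5},
    {"client": "C002", "entry": 1, "exit": 3},  # progress made, even though exit < 5
    {"client": "C003", "entry": 3, "exit": 4},
]

for c in clients:
    print(f"{c['client']}: distance travelled = {c['exit'] - c['entry']}")
```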

  24. Outcomes Monitoring - Other Issues
  • Support plans and documents, including exit surveys, need to support outcomes monitoring
  • Need for an internal quality assurance role to ensure consistency of the data reported across all services
  • Providers need to use the data for internal performance management and evaluate it before contract meetings
  • Need to develop a database to ensure more effective reporting – a minimal sketch follows below
  • SP cannot facilitate ongoing IT support
  • No need for elaborate systems, but a database is essential
  • Need to ensure that all staff receive IT training
  • Co-ordinators definitely need to be IT literate
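As a sketch of the kind of simple (non-elaborate) database the slide argues for, the example below uses SQLite via Python's standard library with an assumed table layout; the table and column names are illustrative, not an agreed SP/Triangle schema.

```python
import sqlite3

# Minimal, assumed schema for recording outcome scores.
conn = sqlite3.connect("outcomes.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS outcome_scores (
        client_id   TEXT,
        indicator   TEXT,     -- e.g. 'maintaining tenancy'
        stage       TEXT,     -- 'entry', 'review' or 'exit'
        score       INTEGER,  -- 1-5 outcomes scale
        recorded_on TEXT      -- ISO date
    )
""")
conn.execute(
    "INSERT INTO outcome_scores VALUES (?, ?, ?, ?, ?)",
    ("C001", "maintaining tenancy", "exit", 5, "2009-12-01"),
)
conn.commit()

# Example report: exit score distribution per indicator
for row in conn.execute(
    "SELECT indicator, score, COUNT(*) FROM outcome_scores "
    "WHERE stage = 'exit' GROUP BY indicator, score"
):
    print(row)
conn.close()
```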

  25. Full Outcomes Data for Pilot Period, January – December 2009
