
Service User Survey Results 2011


Presentation Transcript


  1. Service User Survey Results 2011
  Toni Martin – Senior Consultant, Quality Health

  2. Context
  • NHS Plan:
    • Mandatory annual national surveys (patient and staff)
    • Links to Vital Signs
    • Conclusion: patient and staff surveys are a major source of data for the CQC in evaluating Trust performance
  • CQC's Role:
    • Organise and run the national surveys – the DH may take over the programme in 2012
    • Publish comparative data for each Trust to inform the public and guide funding. The CQUIN scheme linking patient experience data to funding may be extended to MH
  • Inspection: sometimes unannounced or at short notice – roughly a 1 in 5 chance of being inspected in any one year. Inspections are becoming more rigorous
  • Patient Reported Outcome Measures:
    • MH community services could be in scope for industrial-scale PROMs

  3. Operating Framework & Vital Signs
  • Vital Signs on patient experience:
    • Self-reported experiences (a national priority for local delivery, i.e. PCTs decide on the precise content of local Vital Signs)
    • AND respect and dignity ratings
    • "Success" is defined as an increase in the index score for each survey (national priority indicators guidance); one possible index calculation is sketched after this slide
  • The Operational Plans Technical Guidance Master makes clear that patient experience will be gauged using questions in 4 patient survey domains: 13 questions, highlighted in this presentation
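The slides do not spell out how the Vital Signs index score is calculated, so the following is a minimal Python sketch under stated assumptions: each of the 13 questions is already scored on a 0–100 scale, the index is a simple mean of question scores, and "success" is a year-on-year increase in that mean. The question identifiers (q1–q13) and the aggregation rule are illustrative assumptions, not the published scoring method.

```python
# Minimal sketch of one possible index-score calculation (illustrative only).
# Assumptions: each of the 13 Vital Signs questions is pre-scored 0-100 and
# the index is a simple mean. Question IDs q1..q13 are hypothetical.

VITAL_SIGNS_QUESTIONS = [f"q{i}" for i in range(1, 14)]  # 13 questions across 4 domains

def respondent_index(scores):
    """Mean of one respondent's scored answers; unanswered questions are skipped."""
    answered = [scores[q] for q in VITAL_SIGNS_QUESTIONS if scores.get(q) is not None]
    return sum(answered) / len(answered)

def survey_index(respondents):
    """Trust-level index score: the mean of respondent-level index scores."""
    return sum(respondent_index(r) for r in respondents) / len(respondents)

# "Success" in the Operating Framework sense would then be an increase in
# survey_index from one survey year to the next.
```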

  4. Key Points
  • Establish: differences between service users' and managers' / clinicians' views of the service
  • The National Service User Surveys 2003–2011 used essentially the same methodology
  • Substantial revision to the questionnaire in 2011
  • Comparison here with 15,159 service user respondents from core samples in 56 MH function Trusts
  • 86% of MHTs used QH
  • National response rate: 32%. The range is 26%–42%; London and metropolitan areas are lowest; shire counties with the lowest levels of deprivation have the highest response rates

  5. Performance Issues
  Performance differences between Trusts are caused by:
  • CQC standardise data only by age and gender. Our official study shows the need to standardise for ethnicity as well, or to use unstandardised data (a simple direct-standardisation sketch follows this slide)
  • Differences in practice and quality – but also:
    • Differences in social composition
    • Ethnicity: Asian and Black service users have a different pattern of contact – less use of TT, more contact with Psychiatrists. Asian patients are less positive about care planning and CCs, and fewer have access to an out-of-hours service; Black service users are sectioned at double the rate of white service users
    • Age: young patients are less satisfied than older patients – up to an 18-point difference on information issues etc. between 16–24s and 64–75s
    • Gender: women are less positive than men
  • Big differences in perceived service quality still exist between geographically based teams in some Trusts
  • Differences in the composition of the CPA register: the percentage of service users on new CPA ranges from 100% to 8% between Trusts in this survey
    • Too soon to tell how differential implementation of the new CPA system will affect the data. Much looks unchanged from previous years
    • Under the old CPA rules, enhanced CPA service users were much more likely to be aware of care plans, care co-ordinators and out-of-hours phone numbers than those on standard CPA
  • So, the kinds of service users in your sample are crucial
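To make the standardisation point concrete, here is a minimal sketch of direct standardisation of a survey score by age band, gender and ethnicity, assuming each respondent record carries a 0–100 score plus those three attributes, and that the weights come from a chosen reference population (for example, the pooled national sample). The field names and weight layout are assumptions for illustration; this is not CQC's published standardisation method.

```python
# Minimal sketch of direct standardisation by age band, gender and ethnicity.
# Field names, the 0-100 score and the reference-weight layout are illustrative
# assumptions, not CQC's published method.
from collections import defaultdict

def standardised_score(respondents, reference_weights):
    """respondents: iterable of dicts with 'score', 'age_band', 'gender', 'ethnicity'.
    reference_weights: dict mapping (age_band, gender, ethnicity) to that stratum's
    share of the reference population (shares sum to 1)."""
    totals, counts = defaultdict(float), defaultdict(int)
    for r in respondents:
        stratum = (r["age_band"], r["gender"], r["ethnicity"])
        totals[stratum] += r["score"]
        counts[stratum] += 1
    # Weight each stratum's mean score by its share of the reference population.
    weighted_sum, weight_used = 0.0, 0.0
    for stratum, weight in reference_weights.items():
        if counts[stratum]:
            weighted_sum += weight * (totals[stratum] / counts[stratum])
            weight_used += weight
    # Rescale in case some reference strata have no respondents in this Trust.
    return weighted_sum / weight_used if weight_used else float("nan")
```

Comparing a Trust's standardised score with the national figure then removes the part of any difference that is due only to demographic mix rather than to practice or quality.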

  6. CPA Range

  7. National Trends 2003-2011

  8. National Trends 2003-2011

  9. National Trends 2003-2011

  10. National Trends 2003-2011

  11. Key Scores in 2011

  12. Respondents' Details

  13. Contact & Staff

  14. Staff Attitude

  15. Medications

  16. Talking Therapy

  17. Co-ordinators & Care Plans

  18. Care Reviews

  19. Crisis Care

  20. Day to Day Living

  21. Overall

  22. Comparisons with National Data

  23. Movement 2010-11

  24. Issues for Action
  • Better information on medication purposes & side effects, and decisions on meds where possible
  • Ensure that all medication is being effectively reviewed, congruent with your clinical guidelines
  • Assess unmet need for talking therapy
  • Further improve knowledge of who the care co-ordinator / lead professional is and awareness of how they can be accessed
  • Continue action to ensure all service users get a hard copy of their care plan and understand its contents, and ensure formal updating at least annually
  • Further improve the incidence of care reviews – many say they haven't had one in the last year
  • Further extend help to those wanting it on finding work, benefits and housing
  • Ensure all service users have access to the out-of-hours support telephone number
  • Ensure that enough information and support is given to families and carers

  25. The Next Steps
  • Integrate with the Quality Account and ensure Vital Signs action plans are in place
  • Put specific action plans in place to deal with the top service-user-related issues. Build a performance management system which makes managers accountable; the top improving Trusts pick 3–4 issues at most and rigorously performance-manage them from the top
  • Lead the process within the Trust. Keep the pressure up, don't stop. Repeat messages
  • Consider tracker surveys on community and IP services
  • Publicise achievements
