
Building a Lighthouse In a sea of data


Presentation Transcript


  1. Workshop: Lighthouse data – Building a Lighthouse in a sea of data. IHI APAC Forum, New Zealand, September 2012. Richard Hamblin – HQSC; Andrew Terris – Patients First

  2. Workshop. By the end of this session you will be:
  • able to identify the focal points to drive a small number of measures that will influence quality
  • equipped with an approach to ensure measures have broad input while remaining consistent in definition and purpose
  • aware of some of the pitfalls of selecting and combining measures to stimulate improvement

  3. Answering the Key Questions: How do you know you are any good? How do you rank with the best? How do you know you are improving?

  4. Agenda: Why measurement matters. Why it can go wrong. How to avoid this. How Health Quality Measures New Zealand helps.

  5. The four stages of denial
  • The data isn’t right!
  • The data is right but it isn’t relevant to me!
  • The data is right and it is relevant to me but there’s nothing I can do about it!
  • The data is right, it is relevant to me, and I can do something about it – better get on with it!

  6. Why measurement matters ‘We can only be sure to improve what we can actually measure.’ Raleigh and Foot 2010 ‘Evidence suggests that publicly releasing performance data stimulates quality improvement activity at the hospital level.’ Fung et al 2008

  7. But measurement of itself is not improvement

  8. How is reporting performance supposed to work? [Diagram, from Berwick, James and Coye 2003: publicly reported performance data acts through knowledge and motivation, via the pathways of selection and change, on performance – effectiveness of care, safety, patient centredness – with unintended consequences alongside.]

  9. Links to incentives
  • Intrinsic – altruism/professionalism: 60k want to do the right thing
  • Implicit – kudos and censure: it’s not my DHB, is it?
  • Indirect – market share/career enhancement: gongs and gold
  • Direct – P4P, targets with explicit rewards and sanctions: punished by reward?

  10. But sometimes the change is perverse: NHS ambulance response times; New York cardiac surgery; P4P in California; Mid Staffordshire Hospital

  11. Why? Rarely because of bad people (Bevan and Hood, 2005)

  12. So, why does it go wrong? Wrong approach Wrong framing Wrong system incentives Wrong measure Wrong construction

  13. A brief word about data quality. Data quality is a cop-out: get the rest wrong and perfect data will still not work; get the rest right and imperfect data will still be useful.

  14. Wrong approach. Framing systems help to avoid this, e.g. Donabedian (structure, process, outcome); Carter and Klein (tin-openers and dials)

  15. What structure, process and outcome actually are: structure is what you put into a system; process is what you actually do; outcome is what happens as a result. So: use of a specific electronic system is not a process, and clinical actions are not outcomes.

  16. How outcome contextualises process
  • “Hitting the target and missing the point”
  • Failure to recognise improvement
  • Gaming and other nasty things
  • But did it make any difference? [Chart labels: Process, Elation, Despair]

  17. Conclusion – process is not a proxy for outcome: doing the “right” thing may not lead to a desired outcome, and you may end up incentivising some pretty damaging behaviour

  18. NHS Ambulance response times 75% of life-threatening calls responded to in 8 minutes – a clinically relevant process

  19. NHS Ambulance response times: 75% of life-threatening calls responded to in 8 minutes – a clinically relevant process. [Charts of reported response times, each showing 75% < 8 minutes; ‘corrections’ of only 2% to 6%.] Source: http://www.chi.nhs.uk/eng/cgr/ambulance/index.shtml
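To make the scale of those ‘corrections’ concrete, here is a minimal sketch of the arithmetic; the call volume and raw compliance rate are invented for illustration and are not the CHI figures:

```python
# Illustrative arithmetic only: invented call volume and raw rate, not CHI data.
calls = 10_000                 # life-threatening (Category A) calls in a period
raw_within_8_min = 7_200       # genuinely reached within 8 minutes -> 72.0%

for corrected_share in (0.02, 0.04, 0.06):
    corrected = int(calls * corrected_share)          # calls re-timed or reclassified
    reported_rate = (raw_within_8_min + corrected) / calls
    verdict = "meets" if reported_rate >= 0.75 else "misses"
    print(f"'corrections' of {corrected_share:.0%}: reported {reported_rate:.1%} -> {verdict} the 75% target")
```

Re-timing a few percent of calls is enough to turn a miss into an apparent hit, which is why the distribution around the threshold matters as much as the headline figure.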

  20. How outcome contextualises process

  21. Variations in outcome – at least four causes:
  • Genuine variation in quality
  • Unadjustable variation in casemix
  • Local issues of organisation and recording
  • Statistical noise
  • Or some combination thereof – and you don’t know which is causing the variation from the data alone
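The fourth cause is easy to underestimate. A minimal simulation (the rate and case volumes are assumptions, purely for illustration) shows that units with an identical underlying rate still produce visibly different observed rates, especially at small volumes, which is why a single figure cannot tell you which cause is at work:

```python
# Minimal sketch of statistical noise: every "hospital" has the same true rate,
# yet observed rates differ by chance alone. Rate and volumes are assumptions.
import random

random.seed(1)
true_rate = 0.05                      # identical underlying quality everywhere
volumes = [120, 300, 800, 2500]       # cases per year, smallest to largest unit

for n in volumes:
    events = sum(random.random() < true_rate for _ in range(n))
    print(f"{n:>5} cases: observed rate {events / n:.1%} (true rate 5.0%)")
```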

  22. Implications – 1) setting targets for outcomes is impossible; 2) simplistic benchmarking of outcomes is counter-productive

  23. How outcome contextualises process – the four combinations:
  • Process met, outcome improved: looks to be working (but keep watch out for confounders!)
  • Process met, outcome not improved: hitting the target and missing the point? Is there a new problem?
  • Process not met, outcome improved: what else is happening? Regression to the mean?
  • Process not met, outcome not improved: get on with it!

  24. Tin openers and dials • Concept from Carter and Klein (e.g. 1992) • Tin openers open up cans of worms • Dials measure things • Most of the time you need to ask the right questions as much as you need to get the right answers

  25. Use a tin-opener as a dial and this happens

  26. This affects real life

  27. Wrong framing Four frames of measurement Aggregation, aggroupation and synecdoche

  28. The four frames for measuring

  29. Aggregation, Aggroupation, Synecdoche – summary in a sea of data. Aggregation – some sort of arithmetic summary of multiple pieces of data. Aggroupation – organisation to present a (usually visual) analogue. Synecdoche – assuming the part represents the whole.

  30. Aggregation - Risks. Hospital X Quality Scorecard:
  • HSMR: 97
  • Patient Satisfaction Rating: Top Quartile
  • Days since CLAB: 38
  • Overall Grade: B+
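The risk is easiest to see in miniature. In the hypothetical sketch below (the domain names, scores and equal-weight averaging rule are all invented for illustration), a simple arithmetic aggregation produces a reassuring overall grade while one domain is failing badly:

```python
# Hypothetical composite scorecard: an average can hide a failing domain.
# Domains, scores (0-100) and equal weighting are invented for this sketch.
scores = {
    "mortality (HSMR)": 85,
    "patient satisfaction": 92,
    "infection control (CLAB)": 35,   # clearly failing
    "waiting times": 88,
}

composite = sum(scores.values()) / len(scores)
worst = min(scores, key=scores.get)
print(f"composite score: {composite:.0f}/100")                        # ~75: looks fine
print(f"worst domain: {worst} at {scores[worst]}/100, hidden by the average")
```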

  31. Even if you’re sophisticated you still have the same problem

  32. Aggroupation

  33. Synecdoche What if the part isn’t the whole? e.g. Low waiting times do not equal high quality

  34. Wrong system incentives There are times when intrinsic motivation isn’t enough

  35. Measures for? Judgement or improvement?

  36. Measures For? http://www.kingsfund.org.uk/publications/quality_measures.html

  37. So, how do you view the Three Faces of Performance Measurement? As research, as improvement, or as judgment?

  38. …BUT MRSA too. [Chart comparing the baseline year and the target year.]

  39. Exercise 1

  40. Wrong measure: doesn’t measure what it sets out to; impossibility of data collection; ambiguity of interpretation; over-interpretation

  41. Examples: unacknowledged use of structural measures as proxies for outcomes; average LOS as a measure of whole-system productivity; mortality rates as direct measures of quality

  42. Wrong construction Technical checklist Threshold effects

  43. Technical checklist: • Numerator/denominator consistency • Standardisation – when and how • Exclusions • Attribution
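As a worked illustration of the standardisation item (and of why numerator and denominator must describe exactly the same cases), here is a minimal sketch of indirect standardisation, the construction behind ratios such as the HSMR on the earlier scorecard; the risk groups, reference rates and counts are invented:

```python
# Minimal sketch of indirect standardisation (invented groups, rates, counts).
# Expected events come from applying reference rates to the local casemix;
# the ratio is observed / expected * 100.
reference_rates = {"low": 0.01, "medium": 0.05, "high": 0.20}   # e.g. national death rates
local_casemix   = {"low": 400,  "medium": 150,  "high": 50}     # this unit's cases by group
observed_deaths = 18

expected = sum(reference_rates[g] * local_casemix[g] for g in local_casemix)
smr = 100 * observed_deaths / expected
print(f"expected {expected:.1f}, observed {observed_deaths}, standardised ratio {smr:.0f}")
# Exclusions must be applied to numerator and denominator alike,
# or the ratio silently drifts.
```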

  44. Threshold effects: ambulances, ED/A&E time limits. Measure distributions and measure effects. Seriously consider whether directly incentivising around the threshold is a good idea.
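“Measure distributions and measure effects” can be made concrete with a small sketch (the response times are invented): two services report the same share of calls under the 8-minute cut-off, but their distributions, and the experience of the slowest-served patients, are very different:

```python
# Sketch: same threshold compliance, very different distributions (invented data).
from statistics import median, quantiles

service_a = [6, 7, 7, 7, 7, 7, 7, 8, 20, 30]   # bunched just under the 8-minute mark
service_b = [3, 4, 4, 5, 5, 6, 6, 9, 10, 11]   # genuinely faster across the board

for name, times in (("A", service_a), ("B", service_b)):
    share = sum(t < 8 for t in times) / len(times)
    p90 = quantiles(times, n=10)[-1]            # ~90th percentile
    print(f"service {name}: {share:.0%} < 8 min, median {median(times)} min, 90th pct ~{p90:.0f} min")
```

Both services report 70% within 8 minutes, yet their medians and tails differ markedly, which is exactly what a threshold-only measure cannot see.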

  45. Exercise 2 – take offline in the interests of time

  46. Applying the Science: Health Quality Measures NZ

  47. The Approach
  • Creating measures that matter – the “pull” of demand from various health sector stakeholders
  • Democratising the process – balancing top-down consistency with bottom-up feedback and design
  • Aligning information – road testing measures
  • Considering the system view – balancing process, system and outcome measures
  • The information considerations – dataset, utility and relevance at local and aggregate levels

  48. Patients First Framework. [Framework diagram with layers and labels: Outcome 1, Outcome 2, Outcome 3; “informed by”; creating the “pull” through a quality lens; Model 1, Model 2; Measurement – evaluation, prioritisation, quality improvement; “informs”; Information – inter-op dataset, shared care, PMS requirements, GP2GP, pathways.]

  49. Key Principles
  • Collaborative/peer to peer
  • Share experience/expertise
  • Promote transparency
  • Be sustainable
  • Avoid duplication
  • Create a system that can evolve
  • Web based
