
Using Data at the Front-line and Across the System



Presentation Transcript


  1. Using Data at the Front-line and Across the System

  2. Session Objectives
  • To review/understand the purpose, application and structure of Measurement for Improvement
  • To look at how data can be used to drive improvement at the front-line

  3. Why Do You Need Data and Information?
  • To plan for improvement
  • For testing change
  • For tracking compliance
  • For determining outcomes
  • For monitoring long-term progress
  • To tell your story

  4. Model for Improvement
  • Using data to understand progress toward the team's aim
  • Using data to answer the questions posed in the plan for each PDSA cycle
  (The Improvement Guide, API)

  5. Need for Measurement
  Improvement is not about measurement, but measurement plays an important role:
  • Key measures are required to assess progress on the team's aim
  • Specific measures can be used for learning during PDSA cycles
  • Data from the system (including from patients and staff) can be used to focus improvement and refine changes

  6. Why are you measuring? Judgment? Research? Improvement? The answer to this question will guide your entire quality measurement journey!

  7. The Three Faces of Performance Measurement
  Leif Solberg, Gordon Mosser and Sharon McDonald, Joint Commission Journal on Quality Improvement, vol. 23, no. 3 (March 1997), 135-147.

  8. Improvement vs. Research: Contrast of Complementary Methods
  Improvement – Aim: improve the practice of health care. Methods:
  • Test observable
  • Stable bias
  • "Just enough" data
  • Adaptation of the changes
  • Many sequential tests
  • Assess by statistical significance
  Clinical Research – Aim: create new clinical knowledge. Methods:
  • Test blinded
  • Eliminate bias
  • "Just in case" data
  • Fixed hypotheses
  • One fixed test
  • Assess by statistical significance

  9. Three Types of Measures
  • Outcome measures: the voice of the customer or patient. How is the system performing? What is the result?
  • Process measures: the voice of the workings of the system. Are the parts/steps in the system performing as planned?
  • Balancing measures: looking at the system from different directions/dimensions. What happened to the system as we improved the outcome and process measures (e.g. unanticipated consequences, other factors influencing the outcome)?

  10. The Improvement Measurement Journey
  AIM (Why are you measuring?) → Concept → Measure → Operational Definitions → Data Collection Plan → Data Collection → Analysis → ACTION

  11. Improvement Measurement Journey
  • AIM – improved general ward outcomes
  • Concept – prevent healthcare associated infections
  • Measure – % compliance with hand hygiene
  • Operational definition – N (total number of opportunities in the sample where appropriate hand hygiene was conducted) divided by D (total number of opportunities in the sample), multiplied by 100 = % compliance
  • Data collection plan – monthly
  • Data collection – unit submits data for analysis to the area/dept collating data
  • Analysis – run or control chart
  • Tests of change
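The operational definition above is simple arithmetic; a minimal sketch, where the function name and sample figures are illustrative, not from the source:

```python
def percent_compliance(compliant_opportunities, total_opportunities):
    """% compliance = (N / D) * 100, where N is the number of opportunities
    in the sample where appropriate hand hygiene was conducted and D is the
    total number of opportunities in the sample."""
    if total_opportunities == 0:
        raise ValueError("sample contains no opportunities")
    return 100 * compliant_opportunities / total_opportunities

# e.g. 45 compliant opportunities out of 50 observed
print(percent_compliance(45, 50))  # → 90.0
```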

  12. Concept → Potential Measures
  Communication:
  • Percent of staff trained in use of SBAR
  • Percent of staff using SBAR
  • Quality of exchange using SBAR
  Medication errors:
  • Percent of errors
  • Number of errors
  • Medication error rate
  VAPs:
  • Percent of patients with a VAP
  • Number of VAPs in a month
  • Number of days without a VAP
  Every concept can have many measures.

  13. SPSP Measures: General Ward
  Outcome measures:
  • Crash call rate
  • Staph. aureus bacteraemia (SAB) rate, or days between SABs
  • Clostridium difficile infection rate, or days between Clostridium difficile infection occurrences
  Process measures:
  • Percent compliance with Early Warning Score assessment
  • Percent of patients for whom a respiratory rate is recorded each time observation occurs
  • Percent of patients identified as at risk who have appropriate interventions undertaken in their management, as categorised by the early warning score
  • Number of calls to the outreach team
  • Percent compliance with hand hygiene
  • Percent compliance with using safety briefings
  • Percent compliance with using SBAR
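"Days between events" measures such as days between SABs are derived from the dates of successive events; a minimal sketch, with invented dates:

```python
from datetime import date

# Dates of successive SAB events on a ward (illustrative data)
events = [date(2011, 1, 4), date(2011, 2, 20), date(2011, 6, 1)]

# Days elapsed between consecutive events; longer gaps are better
gaps = [(later - earlier).days for earlier, later in zip(events, events[1:])]
print(gaps)  # → [47, 101]
```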

  14. Measurement Guidelines
  • The question "How will we know that a change is an improvement?" usually requires more than one measure
  • A balanced set of five to eight measures will help ensure that the system is improved
  • Balancing measures are needed to assess whether the system as a whole is being improved

  15. Balancing Measures: Looking at the System from Different Dimensions
  • Quality
  • Transactions (volume, no. of patients)
  • Productivity (time, efficiency, utilisation, flow, capacity, demand)
  • Financial (charges, staff hours, materials)
  • Patient satisfaction (surveys, complaints)
  • Staff satisfaction

  16. Expectations for Improvement: When will my data start to move?
  • Process measures will start to move first
  • Outcome measures will most likely lag behind process measures
  • Balancing measures – just monitoring, not looking for movement (but pay attention if there is movement)

  17. Integrate Data Collection for Measures into Daily Work
  • Include the collection of data with another current work activity wherever possible
  • Develop an easy-to-use data collection form, or make Information Systems input and output easy for clinicians
  • Clearly define roles and responsibilities for ongoing data collection
  • Set aside time to review data with all those who collect it

  18. Overall Project Measures vs. PDSA Cycle Measures
  Data for project measures (achieving the aim):
  • Overall results related to the project aim (outcome, process, and balancing measures) for the life of the project
  Data for PDSA measures (adapting changes during PDSA cycles):
  • Quantitative data on the impact of a particular change
  • Qualitative data to help refine the change
  • Subsets or stratification of project measures for particular patients or providers
  • Collect only during cycles

  19. Measurement and Data Collection During PDSA Cycles
  • Collect useful data, not perfect data – the purpose of the data is learning, not evaluation
  • Use pencil and paper until the information system is ready
  • Use sampling as part of the plan to collect the data, to reduce workload
  • Use qualitative data (feedback) rather than waiting for quantitative data
  • Record what went well and what didn't work so well during the test of change

  20. The Problem
  Aggregated data presented in tabular formats or with summary statistics will not help you measure the impact of process improvement/redesign efforts. Aggregated data can only lead to judgment, not to improvement.
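To see why aggregation misleads, compare two hypothetical units whose monthly figures have the same summary statistic but behave very differently over time (data invented for illustration):

```python
from statistics import mean

# Monthly infection counts for two hypothetical units
unit_a = [8, 8, 8, 8, 8, 8]     # flat: no change
unit_b = [13, 11, 9, 7, 5, 3]   # steadily improving

# Aggregated summary statistics look identical...
print(mean(unit_a), mean(unit_b))  # → 8 8

# ...but only the time-ordered data reveals that unit B is improving
improving = all(later < earlier for earlier, later in zip(unit_b, unit_b[1:]))
print(improving)  # → True
```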

  21. [Chart: cycle time results for units 1, 2 and 3]

  22. Using the Data to Drive Improvement
  • Linking outcome and process measures – what does the data tell you?
  • MRSA & MSSA (annual data)
  • How many SAB/line-related infections?
  • How many related to PVC or CVC?
  • Working together with frontline staff
  • Leadership engagement

  23. What can we use to show that the change is making an improvement? Run and Control Charts. They help us turn data into information and allow us to determine if our improvement strategies have had the desired effect.

  24. Elements of a Run Chart
  The centerline (CL) on a run chart is the median. [Chart: a measure plotted over time with the median as the centerline]
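Computing a run chart's centerline is straightforward; a quick sketch with invented compliance values (in a stable process, roughly half the points fall on each side of the median):

```python
from statistics import median

# Monthly % hand-hygiene compliance values for the run chart (invented data)
values = [72, 75, 71, 78, 80, 77, 82, 85, 81, 88]

centerline = median(values)  # the run chart's centerline (CL)
above = sum(v > centerline for v in values)
below = sum(v < centerline for v in values)

print(centerline, above, below)  # → 79.0 5 5
```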

  25. Let the Data tell the story- Annotations

  26. TREND
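A common run-chart convention flags a trend when five or more consecutive points all increase or all decrease (the exact threshold varies between references). A sketch of that rule, with invented data:

```python
def has_trend(points, run_length=5):
    """Return True if any run of `run_length` consecutive points is
    strictly increasing or strictly decreasing -- a common run-chart
    trend rule; the threshold varies between references."""
    for i in range(len(points) - run_length + 1):
        window = points[i:i + run_length]
        steps = [later - earlier for earlier, later in zip(window, window[1:])]
        if all(s > 0 for s in steps) or all(s < 0 for s in steps):
            return True
    return False

print(has_trend([10, 9, 11, 10, 12, 11, 10]))  # → False (normal variation)
print(has_trend([10, 9, 8, 7, 6, 8, 9]))       # → True (five falling points)
```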

  27. Look at the Relationships

  28. Local Display and Feedback of Data

  29. A Simple Improvement Plan
  1. Which process do you want to improve or redesign? (Aim)
  2. How do you plan on actually making improvements? What strategies do you plan to follow to make things better? (Change concepts and tests of change)
  3. What effect (if any) did your plan have on the process? (Measures)
  Run and control charts help you answer question 3; YOU need to figure out the answers to questions 1 and 2.

  30. Key Points To Remember!
  • Are you measuring to diagnose problems within the service, or to determine whether a service change has been an improvement?
  • Develop aims before measuring
  • Design measures around aims
  • Track progress over time (run and control charts)
  • Make results visible and feed back to those who collected the information
