
Session #20: How to Drive Clinical Improvement That Gets Results






Presentation Transcript


  1. Session #20: How to Drive Clinical Improvement That Gets Results • Tom Burton and the Catalyst Academy Education Team

  2. What is a Clinical Program? • Organized around care delivery processes • Permanent integrated team of clinical and analytics staff • Creates an iterative, continuous learning environment • Focus is on sustained clinical outcome improvement (not revenue growth) • Not a Clinical Service Line (although you can leverage service lines as a good starting point)

  3. Organizational AGILE Teams • Permanent teams that meet weekly • Integrated clinical and technical members • Supports multiple care process families • Example: Women & Children’s Clinical Program Guidance Team (MD lead; RN, Clin Ops Director) with MD Leads and RN SMEs for Pregnancy, Normal Newborn, and Gynecology, supported by a Data Architect, Knowledge Manager, and Application Administrator • (Roles span subject matter expertise, data capture, data provisioning & visualization, and data analysis)

  4. Incorporating the most effective learning methods • Teach Others - 90% • Practice by Doing - 75% • Discussion Group - 50% • Demonstration - 30% • Audiovisual - 20% • Reading - 10% • Lecture - 5% • Percentages represent average information retained through the particular learning method (Duke University)

  5. Session Objective: 4 Learning Experiences • Principles of Clinical Programs that Get Results: • Choose the right initiative • Understand variation • Improve data quality • Choose the right influencers

  6. Choose the right initiative

  7. Deal or No Deal Exercise

  8. DEAL or NO DEAL

  9. First Principle • Picking an improvement opportunity randomly is like playing traditional DEAL or NO DEAL • You might get lucky • Choosing the loudest physician, or choosing based on non-data-driven reasons, can disengage other MDs and consume scarce analytical resources on projects that may not be the best investment • It takes about as much effort to work on a large process as it does on a small process

  10. Analytic System Pareto Example: Resources Consumed • Key Findings: • 50% of all inpatient resources are represented by 7 Care Process Families • 80% of all inpatient resources are represented by 21 Care Process Families • (Chart: cumulative % of total resources consumed versus number of Care Process Families, e.g., ischemic heart disease, pregnancy, bowel disorders, spine, heart failure, with markers at 7 CPFs / 50% and 21 CPFs / 80%)
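As an illustration of how such a Pareto ranking might be produced, here is a minimal sketch; the case data, column names, and cost figures are invented for the example and are not from the session.

```python
import pandas as pd

# Hypothetical case-level data: one row per inpatient case with its
# care process family (CPF) and the resources (cost) it consumed.
cases = pd.DataFrame({
    "care_process_family": ["pregnancy", "ischemic heart disease", "spine",
                            "pregnancy", "heart failure", "bowel disorders"],
    "cost": [8000, 25000, 30000, 7500, 18000, 22000],
})

# Total resources consumed per CPF, largest first.
by_cpf = (cases.groupby("care_process_family")["cost"]
               .sum()
               .sort_values(ascending=False))

# Cumulative share of total resources as CPFs are added in rank order.
cumulative_share = by_cpf.cumsum() / by_cpf.sum()

# How many CPFs it takes to cover 50% and 80% of resources.
cpfs_for_50 = (cumulative_share < 0.50).sum() + 1
cpfs_for_80 = (cumulative_share < 0.80).sum() + 1
print(cumulative_share)
print(f"CPFs covering 50% of resources: {cpfs_for_50}")
print(f"CPFs covering 80% of resources: {cpfs_for_80}")
```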

  11. Analytic System Example: Cost per Case, Vascular Procedures • Mean cost per case = $20,000 • Dr. J: 15 cases at an average of $60,000 per case, i.e., $40,000 above the mean x 15 cases = $600,000 opportunity • Another physician: $35,000 above the mean x 25 cases = $875,000 opportunity • Adding physicians one at a time, the running total opportunity grows: $600,000, then $1,475,000, $2,360,000, and $3,960,000
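A quick sketch of that arithmetic, assuming the mean and the two per-physician figures shown above; the second physician's name and exact average are placeholders consistent with the $35,000-above-mean figure.

```python
# A minimal sketch of the opportunity calculation from the slide, assuming
# the per-physician averages and case counts shown; names other than Dr. J
# are illustrative placeholders.
MEAN_COST_PER_CASE = 20_000  # system-wide mean for vascular procedures

physicians = [
    {"name": "Dr. J",       "cases": 15, "avg_cost": 60_000},
    {"name": "Physician B", "cases": 25, "avg_cost": 55_000},
]

running_total = 0
for p in physicians:
    # Opportunity = (average cost above the mean) x number of cases
    opportunity = (p["avg_cost"] - MEAN_COST_PER_CASE) * p["cases"]
    running_total += opportunity
    print(f'{p["name"]}: ${opportunity:,} opportunity '
          f'(running total ${running_total:,})')
```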

  12. Improvement Approach - Prioritization • (2x2 matrix: variability of outcomes on the vertical axis, low to high; resource consumption on the horizontal axis, low to high; each quadrant shows a distribution of cases from poor to excellent outcomes) • Quadrant 1: high variability, high resource consumption • Quadrant 2: low variability, high resource consumption • Quadrant 3: high variability, low resource consumption • Quadrant 4: low variability, low resource consumption

  13. Improvement Approach - Prioritization (continued) • (Repeats the same variability versus resource consumption matrix as slide 12)

  14. Internal Variation versus Resource Consumption • (Bubble chart: Y axis = internal variation in resources consumed; X axis = resources consumed; bubble size = resources consumed; bubble color = clinical domain; quadrants numbered 1-4 as on the prioritization matrix)
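To make the quadrant idea concrete, here is a minimal sketch that places care process families into the four quadrants using a median split on variation and total consumption; the CPF names, cost lists, and thresholds are all assumptions for illustration.

```python
import statistics

# Illustrative sketch: prioritize care process families (CPFs) by combining
# internal variation in cost per case with total resources consumed.
cpf_costs = {
    "pregnancy":              [7000, 7500, 8200, 8100],
    "ischemic heart disease": [18000, 32000, 25000, 41000],
    "spine":                  [28000, 30000, 29500, 31000],
    "heart failure":          [12000, 26000, 15000, 33000],
}

summary = {
    cpf: {"variation": statistics.stdev(costs), "total": sum(costs)}
    for cpf, costs in cpf_costs.items()
}

# Median split on each dimension (an arbitrary but simple threshold choice).
var_cut = statistics.median(s["variation"] for s in summary.values())
tot_cut = statistics.median(s["total"] for s in summary.values())

for cpf, s in summary.items():
    high_var = s["variation"] >= var_cut
    high_tot = s["total"] >= tot_cut
    # Quadrant 1 (high variation, high consumption) is the best first target.
    quadrant = {(True, True): 1, (True, False): 3,
                (False, True): 2, (False, False): 4}[(high_var, high_tot)]
    print(f"{cpf}: quadrant {quadrant} "
          f"(variation ${s['variation']:,.0f}, total ${s['total']:,})")
```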

  15. DEAL or BETTER DEAL

  16. Understand Variation

  17. The Popsicle Bomb Exercise • (One-minute countdown timer) • When you’re finished, note your time and enter it in the HAS app – Poll Question 1

  18. Variation in Results • Corp Analytics shows the results

  19. Less Effective Approach to Improvement: “Punish the Outliers” • Current Condition: significant volume, significant variation (chart: 1 box = 100 cases in a year; cases distributed around the mean from poor to excellent outcomes, with a minimum standard metric marked) • Option 1: “Punish the Outliers” or “Cut Off the Tail” • Strategy: set a minimum standard of quality and focus improvement effort on those not meeting the minimum standard

  20. Effective Approach to Improvement: Focus on “Better Care” • Current Condition: significant volume, significant variation (chart: 1 box = 100 cases in a year; focus on a best practice care process model rather than the mean) • Option 2: Identify Best Practice; “narrow the curve and shift it to the right” • Strategy: identify an evidence-based “Shared Baseline” and focus improvement effort on reducing variation by following the “Shared Baseline” • Often those performing the best make the greatest improvements
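One way to see the difference between the two options is to compare summary statistics before and after each strategy; the sketch below uses made-up outcome scores and arbitrary adjustment factors purely to illustrate “narrow the curve and shift it to the right.”

```python
import statistics

# Minimal sketch with synthetic outcome scores (higher = better outcome);
# the numbers are assumptions chosen only to illustrate the two options.
baseline = [55, 60, 62, 68, 70, 72, 75, 80, 85, 90]

# Option 1 ("cut off the tail"): only cases below a minimum standard improve.
minimum_standard = 65
option1 = [max(score, minimum_standard) for score in baseline]

# Option 2 ("narrow the curve and shift it right"): everyone follows the
# shared baseline, so variation shrinks and the whole distribution shifts up.
mean0 = statistics.mean(baseline)
option2 = [mean0 + 0.5 * (score - mean0) + 10 for score in baseline]

for label, scores in [("Baseline", baseline),
                      ("Option 1: punish outliers", option1),
                      ("Option 2: shared baseline", option2)]:
    print(f"{label}: mean {statistics.mean(scores):.1f}, "
          f"stdev {statistics.pstdev(scores):.1f}")
```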

  21. Round 2 • (One-minute countdown timer) • When you’re finished, note your time and enter it in the HAS app – Poll Question 2

  22. Reduced Variation in Results • Corp Analytics shows the results

  23. Improve Data Quality

  24. The Water Stopper Exercise

  25. Information Management: Fix It at Data Capture • DATA CAPTURE (fix it here): acquire key data elements; assure data quality; integrate data capture into operational workflow • Knowledge Managers (data quality, data stewardship, and data interpretation) and Application Administrators (optimization of source systems) support data capture • DATA PROVISIONING (not here): move data from transactional systems into the Data Warehouse; build visualizations for use by clinicians; generate external reports (e.g., CMS) • DATA ANALYSIS (not here): interpret data; discover new information in the data (data mining); evaluate data quality • Data Architects (infrastructure, visualization, analysis, reporting) support provisioning and analysis • (Legend: Subject Matter Expert, Data Capture, Data Provisioning, Data Analysis)

  26. Data Capture Quality Principles • Accuracy: does the data match reality? (Example: operating room time stamps) • Timeliness: what is the latency of the data capture? (Example: billing data delay; end-of-shift catch-up) • Completeness: how often is critical data missing? (Example: HF ejection fraction)
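For illustration only, here is a small sketch of how each principle might be checked programmatically; the DataFrame, column names, and the 24-hour latency threshold are hypothetical and not part of the session.

```python
import pandas as pd

# Illustrative data-quality checks for the three capture principles; the
# records, column names, and thresholds are hypothetical examples.
records = pd.DataFrame({
    "or_start": pd.to_datetime(["2024-01-05 07:58", "2024-01-05 07:02", None]),
    "or_end": pd.to_datetime(["2024-01-05 07:30", "2024-01-05 09:15",
                              "2024-01-06 11:00"]),
    "charted_at": pd.to_datetime(["2024-01-07 09:00", "2024-01-05 10:00",
                                  "2024-01-06 12:00"]),
    "ejection_fraction": [55.0, None, 40.0],
})

# Accuracy: does the data match reality? (an OR end time before its start
# time cannot be right)
accuracy_flags = records["or_end"] < records["or_start"]

# Timeliness: how long after the event was the data captured?
latency = records["charted_at"] - records["or_end"]
late_flags = latency > pd.Timedelta(hours=24)

# Completeness: how often is critical data missing?
ef_missing_rate = records["ejection_fraction"].isna().mean()

print(f"Accuracy violations: {int(accuracy_flags.sum())}")
print(f"Records charted >24h late: {int(late_flags.sum())}")
print(f"Ejection fraction missing rate: {ef_missing_rate:.0%}")
```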

  27. Challenges with Data “Scrubbing” • Analyst time is spent reworking scrubbing routines • The root cause is never identified • Early binding vs. late binding: what you consider dirty data may actually be useful to others analyzing process failures • Using data to punish vs. using data to learn: a punish strategy promotes hiding the problem so clinicians don’t look bad

  28. Choose the right influencers

  29. Paul Revere's Ride Exercise

  30. Revere vs. Dawes • Paul Revere: "Revere knew exactly which doors to pound on during his ride on Brown Beauty that April night. As a result, he awakened key individuals, who then rallied their neighbors to take up arms against the British.” • William Dawes: "In comparison, Dawes did not know the territory as well as Revere. As he rode through rural Massachusetts on the night of April 18, he simply knocked on random doors. The occupants in most cases simply turned over and went back to sleep." • Diffusion of Innovations (Free Press, 2003) by Everett M. Rogers

  31. Innovators and Early Adopters • Innovators: recruit innovators to redesign care delivery processes (like Revere) • Early adopters: recruit early adopters to chair improvement and to lead implementation at each site (key individuals who can rally support) • N = number of individuals in the group; a much smaller number is needed to influence the group, but they must be the right individuals • (Adoption curve: innovators, early adopters, “the chasm,” early majority, late majority, laggards/never adopters) • * Adapted from Rogers, E. Diffusion of Innovations. New York, NY: 1995.

  32. Guidance Team (prioritizes innovations; early adopters, W&N): meets quarterly to prioritize allocation of technical staff; approves improvement AIMs; reviews progress and removes road blocks • Small Teams (design the innovation; innovators, e.g., OB): meet weekly in an iteration planning meeting; build DRAFT processes, metrics, and interventions; present DRAFT work to the broad teams • Broad Teams (implement the innovation; innovators and early adopters, W&N): broad RN and MD representation across the system; meet monthly to review, adjust, and approve DRAFTs; lead rollout of the new process and measurement • (Legend: Subject Matter Expert, Data Capture, Data Provisioning & Visualization, Data Analysis; care process families: OB, Newborn, GYN)

  33. Organizational AGILE Teams • Permanent teams • Integrated clinical and technical members • Supports multiple care process families • Choose innovators and early adopters to lead • Example: Women & Children’s Clinical Program Guidance Team (MD lead and early adopters; RN, Clin Ops Director) with MD Leads and RN SMEs (innovators) for Pregnancy, Normal Newborn, and Gynecology, supported by a Data Architect, Knowledge Manager, and Application Administrator • (Roles span subject matter expertise, data capture, data provisioning & visualization, and data analysis)

  34. How to identify innovators and early adopters • Ask • Innovators (inventors): “Who are the top three MDs in our group who are likely to invent a better way to deliver care?” • Early Adopters (thought leaders): “When you have a tough case, who are the top three MDs you trust and would go to for a consult?” • Fingerprinting selection process: invite innovators to identify their top three MD choices from among the early adopters to lead the Clinical Program
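As a hypothetical illustration of turning those nomination questions into a shortlist, the sketch below tallies peer nominations and surfaces the most-named physicians; all names and ballots are made up.

```python
from collections import Counter

# Each inner list is one colleague's answer to a nomination question.
innovator_nominations = [
    ["Dr. A", "Dr. B", "Dr. C"],  # "top three MDs likely to invent a better way"
    ["Dr. B", "Dr. C", "Dr. D"],
    ["Dr. C", "Dr. A", "Dr. E"],
]
early_adopter_nominations = [
    ["Dr. F", "Dr. G", "Dr. B"],  # "top three MDs you'd trust for a tough consult"
    ["Dr. G", "Dr. F", "Dr. H"],
    ["Dr. F", "Dr. B", "Dr. G"],
]

def top_three(ballots):
    """Count nominations across all ballots and return the three most named."""
    counts = Counter(name for ballot in ballots for name in ballot)
    return counts.most_common(3)

print("Likely innovators:", top_three(innovator_nominations))
print("Likely early adopters:", top_three(early_adopter_nominations))
```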

  35. Conclusion – TEACH OTHERS

  36. Teach Others Exercise • (Two one-minute countdown timers) • Deal or No Deal: choose the right initiative; prioritize based on process size and variation • Popsicle Bomb: understand variation; measure variation and standardize processes • Water Stopper: improve data quality; fix the problem at the source • Paul Revere’s Ride: choose the right influencers; identify innovators and early adopters to accelerate diffusion of innovation • Take 1 minute and describe the purpose of each exercise to your neighbor, then swap and let them teach you

  37. Exercise Effectiveness Q1: Overall, how effective were the exercises in explaining the principles? • Not effective • Somewhat effective • Moderately effective • Very effective • Extremely effective

  38. Exercise Effectiveness Q2: How effective was the Deal or No Deal Exercise at teaching the principle of prioritizing based on process size and variation? • Not effective • Somewhat effective • Moderately effective • Very effective • Extremely effective

  39. Exercise Effectiveness Q3: How effective was the Popsicle Bomb Exercise at teaching the principle of understanding variation and standardizing processes? • Not effective • Somewhat effective • Moderately effective • Very effective • Extremely effective

  40. Exercise Effectiveness Q4: How effective was the Water Stopper Exercise at teaching the principle of fixing data quality issues at the source? • Not effective • Somewhat effective • Moderately effective • Very effective • Extremely effective

  41. Exercise Effectiveness Q5: How effective was the “Paul Revere Ride” exercise at teaching the principle of choosing the right influencers based on their capabilities as innovators and early adopters? • Not effective • Somewhat effective • Moderately effective • Very effective • Extremely effective

  42. Exercise Effectiveness Q6: Are you interested in running these same exercises in your organization? • Yes • No

  43. Analytic Insights • Questions & Answers

  44. Session Feedback Survey • On a scale of 1-5, how satisfied were you overall with this session? • Not at all satisfied • Somewhat satisfied • Moderately satisfied • Very satisfied • Extremely satisfied • On a scale of 1-5, what level of interest would you have for additional, continued learning on this topic (articles, webinars, collaboration, training)? • No interest • Some interest • Moderate interest • Very interested • Extremely interested • What feedback or suggestions do you have?
