
Rapid Cycle Evaluation of Improvement Initiatives


Presentation Transcript


  1. April 11, 2014, 1:00 – 2:00pm • Rapid Cycle Evaluation of Improvement Initiatives • Gareth Parry, Senior Scientist, IHI • Amy Reid, Research Associate, IHI

  2. Conflicts of Interest • Who has paid us to give talks • Nothing to declare • Who has paid us for advice • Nothing to declare • Who has funded our research • Nothing to declare • Who has paid for us to attend conferences • Nothing to declare • Any other interests that could be connected with our work • Nothing to declare

  3. Introduction 1:00 – 1:10

  4. Agenda • Introduction (10 minutes) • Presentation (30 minutes) • Discussion (15 minutes) • Closing remarks (5 minutes)

  5. Objectives • Understand the importance of applying rapid-cycle formative evaluation approaches • Describe evaluation designs applicable to improvement projects at the innovation, testing, or spread and scale-up phases • Identify and describe realistic aims for your QI projects • Describe the program theory of an improvement project

  6. Presentation 1:10 – 1:40

  7. Hiraeth • A Welsh word that has no direct English translation. • Homesickness tinged with grief or sadness over the lost or departed, a mix of longing, yearning, nostalgia and wistfulness. • Some concepts can be expressed in a single word in one language while others require a detailed description. • Some things are just lost in translation • What do we mean by: • Quality Improvement • Evaluation

  8. Penicillin The Nobel Prize for Medicine was awarded in 1945 to:

  9. Penicillin • 1928: From a single petri dish he had discarded, Alexander Fleming identified a mold and named it Penicillium. • 1930s: Fleming tried with limited success to produce and test the effects of Penicillin as a surface antiseptic. • 1939: Ernst Chain took an interest in Fleming’s work and, with Howard Florey, identified the active ingredient and produced it in larger quantities. • Tested it by injecting Penicillin into two infected mice, which recovered.

  10. Penicillin 1940s • With Norman Heatley, produced sufficient quantities to test penicillin in fifty mice. • Tested it in three people who were dying from bacterial infections. • Subsequent successful clinical trials led to funding to mass produce Penicillin. 1945 • Sir Alexander Fleming, Ernst Chain & Sir Howard Florey awarded Nobel Prize for Medicine

  11. Penicillin • Sir Henry Harris said at the Florey Centenary lecture in 1998: • "Without Fleming, no Chain; without Chain, no Florey; without Florey, no Heatley; without Heatley, no penicillin." • Or: • Without Fleming, no innovation; without Chain and Florey, no testing; without Heatley, no wide-scale use of penicillin.

  12. Degree of Belief (figure: degree of belief, evidence, act)

  13. Degree of Belief in Change Ideas • Low degree of belief: Innovation Phase (set design targets, develop ideas and predictions, and draft an initial conceptual model and change package) • Moderate degree of belief: Pilot Phase (test and revise/amend the conceptual model and change package) • High degree of belief: Adapt and Spread (implement and disseminate a successful change package)

  14. Bayesian Approaches? Lilford RJ, Braunholtz D. "Who’s afraid of Thomas Bayes?" J Epidemiol Community Health. 2000. • Lilford and Braunholtz argue that a theory is needed that links evidence and decision making, and that Bayesian approaches provide the basis of this theory: • "…a decision maker wants to know the probability of an outcome (for example, the new treatment has lower mortality) given the data…" – a Bayesian approach • "…not the probability of the data given an outcome (usually, that there is no difference between treatments)" – a conventional frequentist approach • Bayesian approaches combine a prior ‘degree of belief’ with new data to calculate an updated ‘degree of belief’ in the form of a posterior probability distribution. • In other words, an existing probability of an outcome in a particular setting can be updated in the light of new data.
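The Beta-Binomial sketch below is an editorial illustration (not from the talk) of the updating step described on this slide; the prior parameters and the test-cycle results are hypothetical numbers chosen only to show how a prior degree of belief and new data combine into a posterior.

```python
from scipy.stats import beta  # conjugate Beta prior for a success probability

# Hypothetical prior: moderate degree of belief (mean ~0.6, fairly uncertain)
# that a change idea improves care in a given unit.
prior_a, prior_b = 6, 4

# Hypothetical new data from PDSA test cycles: 8 of 10 tests showed improvement.
successes, failures = 8, 2

# Bayesian update: the posterior degree of belief combines prior and data.
post_a, post_b = prior_a + successes, prior_b + failures
posterior = beta(post_a, post_b)

print(f"Prior mean belief:     {prior_a / (prior_a + prior_b):.2f}")
print(f"Posterior mean belief: {posterior.mean():.2f}")
print(f"95% credible interval: {posterior.ppf([0.025, 0.975]).round(2)}")
```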

  15. Perla RJ, Provost LP, Parry GJ. Seven Propositions of the Science of Improvement: Exploring Foundations. Quality Management in Health Care. 2013; 22:170–186

  16. Aspirational Aims v. Goals • "Aspirational" aims that motivate a project need to be distinguished from realistic goals achievable in the timescale of the project. • Example: • Safer Patients Initiative: (5) Benning et al. BMJ 2011: 14% reduction in the odds of ICU mortality (p=0.250); the study was powered to detect a large difference. • Keystone: (6) Lipitz-Snyderman et al. BMJ 2011: 11% reduction in the odds of ICU mortality (p=0.033); the study was powered to detect a smaller but still clinically meaningful difference. • (7) Nanji et al. Overarching goals: a strategy for improving healthcare quality and safety? BMJ Qual Saf 2012. • (8) Lilford et al. Evaluating policy and service interventions: framework to guide selection and interpretation of study end points. BMJ 2010.
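As a rough, editorial illustration of the power point above (not from the slides), the standard two-proportion sample-size approximation shows how many more patients are needed to detect a modest relative reduction in mortality than an aspirational one; the 20% baseline mortality rate and both effect sizes are assumed values, not figures from the cited studies.

```python
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Approximate patients per arm for a two-proportion z-test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

baseline = 0.20  # assumed ICU mortality rate; illustrative only
print(f"Detect a 50% relative reduction: ~{n_per_arm(baseline, baseline * 0.50):,.0f} patients per arm")
print(f"Detect a 12% relative reduction: ~{n_per_arm(baseline, baseline * 0.88):,.0f} patients per arm")
```

With these assumptions, a study sized to detect the aspirational halving of mortality needs only a few hundred patients per arm, while detecting a 12% relative reduction needs several thousand, which is one reason an underpowered evaluation can report a real but "non-significant" improvement.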

  17. Why Many Improvement Initiatives Are Found to “Fail”

  18. Peter H. Rossi: The “Iron Law” of Evaluation Peter H. Rossi (1987) The iron law of evaluation and other metallic rules. Research in Social Problems and Public Policy, 4: 3-20. “The expected value of any net impact assessment of any social program is zero. This means that our best a priori estimate of a net impact assessment of a program is that it will have no effect.”

  19. Mark W. Friedberg et al. February 25, 2014 Conclusions:  A multipayer medical home pilot, in which participating practices adopted new structural capabilities and received NCQA certification, was associated with limited improvements in quality and was not associated with reductions in utilization of hospital, emergency department, or ambulatory care services or total costs over 3 years. These findings suggest that medical home interventions may need further refinement.

  20. The New England Journal of Medicine. Introduction of Surgical Safety Checklists in Ontario, Canada. David R. Urbach et al. March 13, 2014. Conclusions: Implementation of surgical safety checklists in Ontario, Canada, was not associated with significant reductions in operative mortality or complications.

  21. Innovation to Prototyping: Small Number of Settings • Applied in a narrow range of contexts • Improvement in 100% of sites

  22. Initial Testing: Small Number of Settings • Applied in a wider range of contexts • Improvement in 80% of sites

  23. More Settings as Range of Contexts Begins to Expand • Applied in a wider range of contexts • Improvement in 70% of sites

  24. Wide Range of Contexts • Applied in a wide range of contexts • Improvement in 50% of sites

  25. Reduction in Effectiveness from Applying the Same Fixed-Protocol Program in Different Contexts (figure: innovation sample)

  26. Reduction in Effectiveness from Applying the Same Fixed-Protocol Program in Different Contexts (figure: innovation sample vs. evaluation sample under immediate wide-scale implementation)
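A toy calculation (an editorial illustration, not from the slides) makes the dilution concrete: if the fixed protocol produces the same per-site effect wherever it fits the local context and no effect elsewhere, the average effect measured across an evaluation sample falls in proportion to the share of sites where it works.

```python
# Hypothetical per-site effect when the fixed protocol suits the local context.
effect_when_it_works = 0.30   # e.g. a 30% relative improvement; illustrative only

# Share of sites where the unamended protocol works as the range of contexts
# widens, mirroring the 100% -> 80% -> 70% -> 50% progression in slides 21-24.
phases = [("Innovation/prototyping", 1.00),
          ("Initial testing",        0.80),
          ("Expanding contexts",     0.70),
          ("Wide range of contexts", 0.50)]

for phase, share_working in phases:
    average_effect = effect_when_it_works * share_working
    print(f"{phase:<24} average measured effect: {average_effect:.2f}")
```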

  27. Effectiveness May Be Maintained If We Can Learn in What Contexts the Protocol Can Be Amended to Work (figure: innovation sample) • Learn which contexts the protocol can be amended to work in as we move from Innovation to Prototype to Test and Spread.

  28. Donald T. Campbell Donald T. Campbell (1969) Reforms as experiments. American Psychologist, 24(4): 409-29. “The United States and other modern nations should be ready for an experimental approach to social reform…in which we learn whether or not these programs are effective, and in which we retain, imitate, modify or discard them on the basis of their apparent effectiveness on the multiple imperfect criteria available.”

  29. Core Concepts & Detailed Tasks • Core Concepts (Theory) are paired with Detailed Tasks and Local Adaptations (Action): • Core concept: Use a reliable method to identify deteriorating patients in real time. Local adaptations: a MEWS trigger of >=5 at one site, >=4 at another. • Core concept: When a patient is deteriorating, provide the most appropriate assessment and care as soon as possible. Local adaptations: a response team of 2 nurses and 1 physician, of 1 nurse and 1 physician, or of 1 physician alone.

  30. Core Concepts & Detailed Tasks • Core Concept (Theory): describes the underlying principles that underpin a new model or intervention. • Detailed Tasks (Action): describe the detailed actions and activities, associated with a core concept, that are applied in a local setting.
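One way to picture the split is as a fixed protocol skeleton with site-level configuration. The sketch below is an editorial illustration only; the MEWS thresholds and team compositions are hypothetical examples, not clinical recommendations.

```python
from dataclasses import dataclass

@dataclass
class SiteAdaptation:
    """Detailed tasks (action): local choices that realize the core concepts."""
    site: str
    mews_trigger: int               # locally chosen MEWS escalation threshold
    response_team: tuple[str, ...]  # locally chosen responder mix

def escalate_if_deteriorating(mews_score: int, adaptation: SiteAdaptation) -> str:
    """Core concept (theory): identify deteriorating patients reliably and respond
    promptly. The concept is fixed; the trigger and responders are adapted per site."""
    if mews_score >= adaptation.mews_trigger:
        return f"{adaptation.site}: escalate to {', '.join(adaptation.response_team)}"
    return f"{adaptation.site}: continue routine monitoring"

site_a = SiteAdaptation("Site A", mews_trigger=5, response_team=("2 nurses", "1 physician"))
site_b = SiteAdaptation("Site B", mews_trigger=4, response_team=("1 physician",))

print(escalate_if_deteriorating(4, site_a))  # below Site A's trigger: keep monitoring
print(escalate_if_deteriorating(4, site_b))  # meets Site B's trigger: escalate
```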

  31. What Type of Evaluation? • Formative evaluation while degree of belief is low to moderate: Innovation Phase (set design targets, develop ideas and predictions, and draft an initial conceptual model and change package) and Pilot Phase (test and revise/amend the conceptual model and change package) • Summative evaluation once degree of belief is high: Adapt and Spread (implement and disseminate a successful change package)

  32. Generating the pressure (will) for ICUs to take part • A networked community • Re-framing BSIs (bloodstream infections) as a social problem • Approaches that shaped a culture of commitment • Use of data as a disciplinary force • Hard edges • Milbank Quarterly 2011

  33. What are we learning? The Kirkpatrick Evaluation of Learning Framework has four levels: • Level 1: What was the participants’ experience? Did the participants have an excellent experience working on the improvement project? • Level 2: What did the participants learn? Did they learn improvement methods and begin testing? • Level 3: Did they modify their behavior? Did they work differently and see change in their process measures? • Level 4: Did the organization improve its performance? Did they improve their outcomes?

  34. Content Theory • Change Concepts, Improvement Drivers (Level 3) → Organizational, Patient-level Outcomes (Level 4) • What changes will teams make that will result in improvement? • Explains how we predict that the change concepts and improvement drivers applied in the project will lead to improved outcomes.

  35. Execution Theory • IHI Activities and External Partner Activities → Participant Experience (Level 1) → Learning (Level 2) → Process/Behavior Changes (Level 3) • What will you do that will lead teams to adopt the process changes? • Explains what IHI and other partners are doing that will lead front-line teams to adopt the changes (improvement drivers) described in the content theory.
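To make the two theories concrete, here is a hypothetical sketch (not an IHI tool) of a program-theory record that pairs the execution chain (activities → experience → learning → behavior, Kirkpatrick levels 1–3) with the content chain (change concepts → outcomes, level 4); all field names and example measures are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ExecutionTheory:
    """What IHI and partner activities should produce in teams (Kirkpatrick levels 1-3)."""
    activities: list[str]            # e.g. learning sessions, coaching calls
    experience_measures: list[str]   # level 1: participant experience
    learning_measures: list[str]     # level 2: what participants learned
    behavior_measures: list[str]     # level 3: process/behavior changes adopted

@dataclass
class ContentTheory:
    """Why the adopted changes are predicted to improve outcomes (level 4)."""
    change_concepts: list[str]       # improvement drivers teams will test
    outcome_measures: list[str]      # level 4: organizational / patient-level outcomes

@dataclass
class ProgramTheory:
    execution: ExecutionTheory
    content: ContentTheory

theory = ProgramTheory(
    execution=ExecutionTheory(
        activities=["learning sessions", "coaching calls"],
        experience_measures=["post-session participant survey"],
        learning_measures=["self-assessed improvement skills"],
        behavior_measures=["% of patients with an early-warning score documented"],
    ),
    content=ContentTheory(
        change_concepts=["reliable identification of deterioration", "rapid response"],
        outcome_measures=["unplanned ICU transfers", "in-hospital mortality"],
    ),
)
print(theory.content.outcome_measures)
```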

  36. Conceptual Model (What’s the theory?) → New Conceptual Model (What’s the new theory?)

  37. An Approach to Evaluation • Assumptions from a Science of Improvement perspective: • Learn what it takes to bring about improvement. • Improvement requires social change; people are more likely to act if they believe in the change. • Interventions need to be amended to local settings (contexts). • Concepts rather than fixed protocols are a good starting point for people to test and learn whether improvement interventions can be amended to their setting.

  38. Discussion 1:40 – 1:55pm

  39. With the person next to you… • What are your initial thoughts about this type of approach? • How would you implement this approach? • What major challenges would you foresee?

  40. Closing Remarks 1:55 – 2:00 pm

  41. Generalizable Knowledge (figure: sites take broad concepts and a conceptual model, drawn from prior research and experience, implement a contextual adaptation with local amendments, test and refine it, and ask whether outcomes improved; the diagram runs along a continuum from specific-to-context to generalizable) • Ovretveit, Leviton, Parry. BMJ Qual Saf 2011.

  42. Perla RJ, Provost LP, Parry GJ. Seven Propositions of the Science of Improvement: Exploring Foundations. Quality Management in Health Care. 2013; 22:170–186

  43. An Approach to Improvement • The "Science of Improvement" is an applied science with philosophical underpinnings (systems, epistemology, variation, psychology); it is not a specific intervention. • Assumptions from a Science of Improvement perspective: • Learn what it takes to bring about improvement. • Interventions need to be amended to local settings (contexts). • Concepts rather than fixed protocols are a good starting point for people to test and learn whether improvement interventions can be amended to their setting. • Provide a prediction, or degree of belief, that a model will achieve a particular impact in a specific setting. • Improvement requires social change; people are more likely to act if they believe in the change.
