
Risk Theory


Presentation Transcript


  1. Risk Theory Matt Hill Consultant Anaesthetist Plymouth Hospitals NHS Trust

  2. Error myths. • Error equates to incompetence • the more experienced you are, the bigger the mistakes you will make • Errors are intrinsically bad • Just world hypothesis • Errors occur out of the blue

  3. Today • Why is medicine dangerous? • Error classification • How our minds work • Sources of error • The individual • The system • The culture • What we can do about it

  4. Why is medicine dangerous? Perrow 1984

  5. Levels of performance control. Cognitive control modes map onto the situations they handle: • Automatic control: routine situations (skill-based performance) • Mixed control: trained-for problems (rule-based performance) • Conscious control: novel problems (knowledge-based performance)

  6. Interacting with the long-term memory base • Similarity matching and frequency gambling are automatic, unconscious and continually operative • What barks, has four legs, wags its tail and cocks its leg at lampposts? • This is similarity matching • Give examples of a four-legged animal • This is frequency gambling, as retrieval is guided by familiarity with the animal
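To make these two retrieval heuristics concrete, here is a minimal Python sketch. Everything in it (the memory entries, feature sets and frequency counts) is invented for illustration and is not from the talk:

```python
# Toy model of retrieval from long-term memory: concepts carry features
# (for similarity matching) and an encounter count (for frequency gambling).
# All data and names here are illustrative assumptions.
MEMORY = {
    "dog": {"features": {"barks", "four legs", "wags tail", "cocks leg"}, "seen": 90},
    "fox": {"features": {"barks", "four legs", "wags tail"}, "seen": 5},
    "cat": {"features": {"four legs", "purrs"}, "seen": 60},
}

def recall(cues: set[str]) -> str:
    """Similarity matching: rank concepts by feature overlap with the cues.
    Frequency gambling: break ties in favour of the more familiar concept."""
    return max(MEMORY, key=lambda name: (len(cues & MEMORY[name]["features"]),
                                         MEMORY[name]["seen"]))

print(recall({"barks", "four legs", "wags tail", "cocks leg"}))  # dog (similarity)
print(recall({"four legs"}))  # dog again: a three-way tie, settled by frequency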

  7. Relationship between memory and attention

  8. Kanizsa triangle

  9. Error classification.

  10. Mistake types • Definitions: • Mistake: planning failure • Lapse: storage (memory) failure • Slip: action not as planned • Violation: deliberate deviation from normal behaviour

  11. Errors • Not achieving the desired outcome does not necessarily signify a mistake • A plan has two elements • A process • An outcome

  12. Error classification based on action • Omissions • Intrusions: appearance of unintended actions • Repetitions • Wrong objects: correct actions but on wrong object • Mis-orderings: right actions but in wrong order • Mis-timings: right actions but at wrong time • Blends: unintended merging of two action sequences meant to serve different goals
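A toy classifier makes these categories concrete. This is a hedged sketch under simplifying assumptions (the function, the drug-administration example and the counting logic are all invented, and real slips are rarely this clean-cut):

```python
from collections import Counter

def classify(planned: list[str], executed: list[str]) -> str:
    """Label the deviation between a planned and an executed action sequence
    using the action-based categories above (toy logic, one label at a time)."""
    if executed == planned:
        return "no error"
    if set(executed) - set(planned):
        return "intrusion: an unintended action appeared"
    p, e = Counter(planned), Counter(executed)
    if any(e[a] > p[a] for a in e):
        return "repetition: an action was performed again"
    if any(e[a] < p[a] for a in p):
        return "omission: a planned action was skipped"
    return "mis-ordering: right actions, wrong order"

plan = ["check ID", "draw up drug", "label syringe", "give drug"]
print(classify(plan, ["check ID", "draw up drug", "give drug"]))  # omission
print(classify(plan, ["draw up drug", "check ID", "label syringe", "give drug"]))  # mis-ordering
```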

  13. Contextual factors. • Anticipations and perseverations • Errors shaped by what has just happened or what is coming up • Priming • Interruptions and distractions • May lead us to believe we were further along in the task than we were, or cause us to repeat steps • Stress

  14. The individual as a source of error. We are free agents capable of choosing between safe and unsafe behaviours

  15. Human as hazard versus human as hero. • Emphasis on the errors and violations that we make, with very little emphasis on the insights, recoveries, adjustments, adaptations, compensations and improvisations performed daily.

  16. The downside of human as hero • Local workarounds • Normalisation of deviance • Doing too much with too little • Forgetting to be afraid

  17. The system as a source of error.

  18. Swiss cheese model: successive layers of defences, each with holes opened by active failures and latent conditions; an accident occurs when the holes momentarily line up. Reason 2000
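The model's logic can be sketched as back-of-envelope arithmetic: an accident trajectory needs a hole in every layer, so per-layer hole probabilities multiply. The numbers below are invented, and the independence assumption is exactly what latent conditions undermine (they widen and align holes across layers):

```python
from math import prod

# Invented per-layer probabilities that a given hazard slips through.
hole_prob = {"prescriber": 0.05, "pharmacy": 0.03, "ward check": 0.04, "monitoring": 0.10}

# Independence assumption: the trajectory must find a hole in every layer.
p_accident = prod(hole_prob.values())
print(f"{p_accident:.1e}")  # 6.0e-06, i.e. about 6 accidents per million hazards

# A latent condition (say, chronic understaffing) that merely doubles every
# hole multiplies the accident rate by 2**4 = 16: defences erode multiplicatively.
```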

  19. System problems • Uncoordinated approach • Bolt on • Normalisation of deviance • Doing too much with too little • Arrogance • Forgetting to be afraid

  20. Vulnerable system syndrome • Three pathological entities • Blame • Denial • Pursuit of the wrong kind of excellence

  21. Person and system models: getting the balance right • Personal qualities of diligence and attention to detail matter • Can’t fall prey to learned helplessness – it’s the system • Institutional resilience • Error wisdom on the frontline

  22. Why do people violate safety rules? • I can handle it • I can get away with it • I can’t help it • Everyone does it • It’s what they (the company) really want • They’ll turn a blind eye

  23. Violations • Who is most likely to violate? • Young men • Having a high opinion of their work skills relative to others • Who may be relatively experienced and not especially error prone • Who are more likely to have a history of incidents and accidents • Who are significantly less constrained by what other people think and by negative beliefs about outcomes

  24. Mental economics of violating • Costs and benefits of non-compliance vs compliance • Non-compliance is often an easier way of working and brings no obvious bad effects • Benefits are immediate and certain; the costs are seemingly remote and unlikely
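A worked example with invented numbers shows why this trade feels rational even when it is a losing bet in expectation:

```python
# All figures invented for illustration.
time_saved = 2.0            # minutes gained per shortcut: immediate and certain
p_harm = 1 / 5000           # chance a given shortcut causes harm: remote, unlikely
cost_of_harm = 100_000.0    # minutes-equivalent cost of a serious incident

# Expected value of one violation: already negative...
print(time_saved - p_harm * cost_of_harm)        # -18.0 minutes

# ...but lived experience almost never presents the cost. After 200 violations:
print(200 * time_saved)                          # +400 minutes banked
print(f"{(1 - p_harm) ** 200:.2f}")              # 0.96 chance of never seeing harm
```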

  25. Performance control problems • Long-term memory, containing mini-theories rather than bald facts, leaves us liable to tunnel vision and confirmation bias. • Limited attentional resources, necessary for coherent planned action, leave us prey to inattention and information overload.

  26. “We know there are known knowns: there are things we know we know. We also know there are known unknowns: that is to say we know there are things we know we don't know. But there are also unknown unknowns — the ones we don't know we don't know” Donald Rumsfeld, 2002. What did he mean?

  27. Three bucket model • Individual mindfulness: gauge how full the three buckets (Self, Context, Task) are, to recognise situations which have high error-provoking potential, as in the sketch below
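A minimal scoring sketch of the bucket check follows. The 1-to-3 fullness scale per bucket and the step-back threshold are common teaching conventions assumed here rather than taken from the slides:

```python
def bucket_check(self_state: int, context: int, task: int) -> str:
    """Rate each bucket from 1 (nearly empty) to 3 (full):
    self    - fatigue, illness, inexperience;
    context - distractions, handovers, staffing;
    task    - complexity, novelty, time pressure."""
    total = self_state + context + task  # ranges from 3 to 9
    if total >= 6:  # assumed threshold
        return f"{total}/9: high error-provoking potential, step back and get help"
    return f"{total}/9: proceed, but stay mindful"

# A tired trainee (3) on a chaotic night shift (2) doing a routine task (1):
print(bucket_check(3, 2, 1))  # 6/9: high error-provoking potential
```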

  28. Rubber band model: compensatory corrections keep the system inside its safe operating zone; a stable system is perturbed toward danger and then corrected back

  29. Resources: the safe operating zone is bounded by the balance between protective resources and productive resources, shown for the same stable, perturbed and corrected systems
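Slides 28 and 29 describe one dynamic from two angles: perturbations stretch the system away from its safe operating zone, and compensatory corrections, paid for out of protective resources, pull it back. A toy simulation with invented parameters:

```python
SAFE_LIMIT = 1.0   # edge of the safe operating zone (arbitrary units)
GAIN = 0.6         # strength of the compensatory correction

state = 0.0  # stable system sits at the centre of the zone
for perturbation in [0.4, 0.5, 0.6, -0.2]:
    state += perturbation                 # perturbed system
    if abs(state) > SAFE_LIMIT:           # stretched into the danger zone
        state -= GAIN * state             # corrected system snaps back
    zone = "danger" if abs(state) > SAFE_LIMIT else "safe"
    print(f"state={state:+.2f} ({zone})")
# Exhaust the protective resources (GAIN -> 0) and the same perturbations
# leave the system stranded outside the safe operating zone.
```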

  30. Culture as a source of error.

  31. Culture. • Definition: • “those attitudes, assumptions and values which condition the way in which individuals and the organisation work” • Kennedy 2001 • Is it important?

  32. Safety culture • Pathological • Muzzle, malign or marginalise whistleblowers and shirk collective responsibility; punish or cover up failures • Bureaucratic • Generative • Encourage individuals and groups to observe, to inquire, to make their conclusions known and, where observations concern important aspects of the system, actively bring them to the attention of higher management

  33. HROs (high reliability organisations) • Preoccupation with failure • Reluctance to simplify • Sensitivity to operations • Commitment to resilience • Deference to expertise Weick and Sutcliffe 2001

  34. A prolonged period without an adverse event does not signal “safe enough”; it is more correctly seen as a period of heightened danger

  35. Briefings • Pre-operative briefings: • reduce the number of communication failures and promote proactive and collaborative teamwork (Sexton et al, 2006; Lingard et al, 2008) • reduce unexpected delays through a decrease in communication problems (Nundy et al, 2008) • ensuring everyone knows each other's names facilitates communication and breaks down the hierarchical gradient (Singer et al, 2009) • improve teamwork (Sexton et al, 2006) • reduce peri-operative mortality (Mazzocco et al, 2009)

  36. Communication • Surgeons reporting better communication and collaboration had an association with lower patient mortality rates (Davenport) • Improved communication resulted in better patient outcomes (de Leval, 2000) • Improved teamwork counteracts some of the negative effects of fatigue on performance (Sexton et al, 2006)

  37. Checklists • Reduction in morbidity and mortality (Haynes et al, 2010) • CVC checklist reduced infection rates (Pronovost et al, 2008) • Is it the checklist or is it the associated change in culture? • It becomes alright to challenge someone who is not doing a task correctly

  38. Management • Managers have a more positive view of safety than frontline workers (Singer et al, 2008) • Improved safety climate through safety walkrounds by executives (Thomas et al, 2005) • Perceptions of management inversely correlate with ICU outcomes (Huang et al, 2010) • Decreased sick rate with improved safety climate (Kivimaki et al, 2001)

  39. Domains

  40. Briefings

  41. Domains (% positive)

  42. What can we do?

  43. The greater the danger, the greater is the need for error wisdom. James Reason

  44. Questions?
