
DESIGNING PROCESSES TO REDUCE ERROR AND TO FACILITATE PRECISE ACTION DR. JOHN R. GROUT


Presentation Transcript


  1. DESIGNING PROCESSES TO REDUCE ERROR AND TO FACILITATE PRECISE ACTION DR. JOHN R. GROUT jgrout@berry.edu

  2. Introduction by examples A picture is worth a thousand words: Examples are a very powerful way of understanding mistake-proofing

  3. Mistake-proofing: a preliminary definition • Mistake-proofing is the use of process design features* to facilitate correct actions, prevent simple errors, or mitigate the negative impact of errors. • Mistake-proofing tends to be inexpensive, very effective, and based on simplicity and ingenuity. • It will not make processes free of all errors, nor is it a stand-alone technique that will eliminate the need for any other responses to error. • Something already in use in healthcare, but more could be done. *these process design features will be referred to as “devices” or “counter-measures”

  4. Why design? Berwick: “The remedy is in changing systems of work. The remedy is in design.” If you do not remedy the design of healthcare …

  5. Design …Then the methods of reducing risks and hazards are limited to: • What can be put on paper and subsequently… • What can be embedded in the human brain. “Knowledge in the head”* *Source: Donald Norman, The Design of Everyday Things

  6. Design • Berwick hopes “that normal, human errors can be made irrelevant to outcome, continually found, and skillfully mitigated.” • Can human errors become irrelevant by only changing knowledge in the head? “Knowledge in the World”* *Source: Donald Norman, The Design of Everyday Things

  7. Knowledge in the World vs. Knowledge in the Head World: • Provide clues about what to do • Change process design: embed the details in the process • Frees the mind to consider the “big picture” • Facilitates “knowledge work” Head: • Alter SOPs • Retrain • Re-certify skills • Manage & enhance attentiveness

  8. To err is human • Have you ever traveled to work and not remembered it? • Have you ever gone home when you meant to stop at a store? Why does that happen? How would you prevent it if your life depended on it?

  9. Corrective action • I polled the Quality newsgroup on the internet. A majority reported that at least 20-30% of corrective or remedial actions were “worker reprimanded and retrained.” • Admonitions to “be more careful” or “pay attention” are not effective for humans, especially in repetitive environments.

  10. “Be more careful” not effective • “The old way of dealing with human error was to scold people, retrain them, and tell them to be more careful … My view is that you can’t do much to change human nature, and people are going to make mistakes. If you can’t tolerate them ... you should remove the opportunities for error.” • “Training and motivation work best when the physical part of the system is well-designed. If you train people to use poorly designed systems, they’ll be OK for awhile. Eventually, they’ll go back to what they’re used to or what’s easy, instead of what’s safe.” • “You’re not going to become world class through just training, you have to improve the system so that the easy way to do a job is also the safe, right way. The potential for human error can be dramatically reduced.” Chappell, L. 1996. The Pokayoke Solution. Automotive News Insights, (August 5): 24i. LaBar, G. 1996. Can Ergonomics Cure ‘Human Error’? Occupational Hazards 58(4): 48-51.

  11. A new attitude toward preventing errors: “Think of an object’s user as attempting to do a task, getting there by imperfect approximations. Don’t think of the user as making errors; think of the actions as approximations of what is desired.”* These approximations are part of Norman’s concept of “knowledge in the head” *Source: Norman, The design of everyday things. Doubleday 1988.

  12. A New Attitude toward Preventing Errors • Make wrong actions more difficult • Make it possible to reverse actions — to “undo” them—or make it harder to do what cannot be reversed. • Make it easier to discover the errors that occur. • Make incorrect actions correct. These outcomes do not occur without design changes
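
As a minimal illustration (not from the slides) of the “undo” idea above, the sketch below records a reversal step for every action so that the most recent action can always be taken back; the class and method names are hypothetical.

    # Hypothetical sketch: each action records how to reverse itself.
    class OrderEntry:
        def __init__(self):
            self.orders = []
            self._undo_stack = []

        def add_order(self, order):
            self.orders.append(order)
            # Store the reversal alongside the action so "undo" is always possible.
            self._undo_stack.append(lambda: self.orders.remove(order))

        def undo(self):
            if self._undo_stack:
                self._undo_stack.pop()()   # run the most recent reversal

    entry = OrderEntry()
    entry.add_order("order A")
    entry.undo()                           # order A is removed again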

  13. Precise outcomes without precise knowledge or action? Provide clues about what to do: • natural mappings • affordances • visibility • feedback • constraints

  14. Natural Mappings: Which dial turns on the burner? Stove A Stove B

  15. Affordances: How would you operate these doors? Push or pull? Left side or right? How did you know? (Doors A, B, C)

  16. Affordances: How would you lift this pan? (“SUPPORT THE BOTTOM”)

  17. Visibility and Feedback • Visibility means making relevant parts visible, and effectively displaying system status. • Feedback means providing an immediate and obvious effect for each action taken.
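
A small sketch of these two ideas in software terms, using an assumed (hypothetical) device interface: every action produces an immediate, visible effect, and the relevant state can always be displayed.

    # Illustrative only; InfusionPumpUI and its fields are assumptions.
    class InfusionPumpUI:
        def __init__(self):
            self.rate_ml_per_hr = 0

        def set_rate(self, rate):
            self.rate_ml_per_hr = rate
            print(f"Rate set to {rate} mL/hr")      # feedback: obvious effect per action

        def status(self):
            # visibility: relevant state is on display, not hidden
            return f"Current rate: {self.rate_ml_per_hr} mL/hr"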

  18. Constraints: How would you assemble these parts?

  19. Constraints? • Logical: Based on making sense of relationships (Example: assembly by the process of elimination) • Semantic: Relies on clues about the meaning of the situation (Example: face oriented correctly) • Cultural: Adheres to a known convention (Example: front & rear lights) • Physical: Shape and size of objects control their relationship (Example: front vs. rear hub) Source: The Design of Everyday Things, by D.A. Norman, 1988, Doubleday
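
A software analogue of a physical constraint can be sketched under the assumption of two incompatible connector types (the syringe example is illustrative, not from the slides): the wrong part simply cannot be combined with the process.

    # Hypothetical types standing in for physically incompatible parts.
    class LuerLockSyringe:
        pass

    class OralSyringe:
        pass

    def connect_iv_line(syringe):
        # Physical-constraint analogue: only the matching "shape" is accepted.
        if not isinstance(syringe, LuerLockSyringe):
            raise TypeError("Only a Luer-lock syringe connects to the IV line")
        return "connected"

    connect_iv_line(LuerLockSyringe())    # fits
    # connect_iv_line(OralSyringe())      # would raise TypeError: the part will not fit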

  20. Strong Mistake-Proofing -- Design Processes to Fail? “We rely on failure of all kinds being designed into many of the products we use every day, and we have come to depend upon things failing at the right time to protect our health and safety...” “We often thus encourage one mode of failure to obviate a less desirable mode.” In other words, a benign failure (Petroski, 1997, page 415)

  21. Designing Benign Failures Engineering: • "Failure is a relative concept, and we encounter it daily in more frequent and broad-ranging ways than is generally realized. And that is a good thing, for certain types of desirable failures [benign ones], those designed to happen, are ones that engineers want to succeed at effecting." • (Petroski, 1997, page 416)

  22. Designing Benign Failures Medicine: • “…a process that is designed to detect failure and to interrupt the process flow is preferable to a process that continues on in spite of the failure…We should favor a process that can, by design, respond automatically to a failure by reverting to a predetermined (usually safe) default mode.” Note that interruptions are themselves process failures. Croteau & Schyve, Proactively Error-Proofing Health Care Processes, in Spath, P.L., Error Reduction in Health Care. Chicago: AHA Press, 2000.
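
One way to read this in code, as a rough sketch with assumed names: when a check fails, the process is interrupted and the setting reverts to a predetermined safe default rather than continuing.

    SAFE_DEFAULT_RATE = 0   # assumed safe state: stop delivery

    def apply_setting(requested_rate, is_valid, set_rate):
        if not is_valid(requested_rate):
            set_rate(SAFE_DEFAULT_RATE)   # revert to the predetermined safe default
            # The interruption is itself a (benign) process failure.
            raise RuntimeError("Failure detected; process interrupted")
        set_rate(requested_rate)

    # Example use (limits are illustrative):
    # apply_setting(999, is_valid=lambda r: r <= 150, set_rate=print)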

  23. Interdisciplinary Approach to Design Psychology: • Norman recommends designing forcing functions into processes: “actions are constrained so that failure at one stage prevents the next step from happening.” “[They] rely upon properties of the physical world for their operation; no special training is necessary.” “Knowledge in the Head” vs. “Knowledge in the World”
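
A forcing function can also be sketched in software (the patient-identification names below are hypothetical): the second step literally cannot run until the first has succeeded, because the first step produces the object the second step requires.

    class VerifiedPatient:
        def __init__(self, patient_id):
            self.patient_id = patient_id

    def verify_wristband(scanned_id, chart_id):
        if scanned_id != chart_id:
            # Failure at this stage prevents the next step from happening.
            raise ValueError("Wristband does not match chart")
        return VerifiedPatient(scanned_id)

    def dispense(med, patient):
        # There is no way to reach this step without a VerifiedPatient in hand.
        assert isinstance(patient, VerifiedPatient)
        print(f"Dispensing {med} to patient {patient.patient_id}")

    dispense("medication A", verify_wristband("12345", "12345"))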

  24. Designing Benign Failures Quality Management: • Shingo recommends that “when abnormalities occur, shut down the machines or lock clamps to halt operations thereby preventing the occurrence of serial defects.” • With “the idea of discovering errors in conditions that give rise to defects and performing feedback and action at the error stage so as to keep those errors from turning into defects.”
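
Shingo's rule can be paraphrased as a fail-fast loop; the sketch below (assumed names, not Shingo's own) stops the line the moment an abnormal condition is found, before it can turn into a string of defects.

    def run_line(items, is_abnormal, process):
        for item in items:
            if is_abnormal(item):
                # Halt immediately rather than letting serial defects accumulate.
                raise RuntimeError(f"Abnormal condition at {item!r}; line halted")
            process(item)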

  25. But How do you Design Failures? • Use your favorite failure analysis method: • Failure mode and effects analysis (FMEA) • Fault trees • Causation chart • And many others
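
As one concrete example of the first method, an FMEA typically scores each failure mode for severity, occurrence, and detection and multiplies the scores into a risk priority number (RPN); the failure modes below are purely illustrative.

    # RPN = severity x occurrence x detection (each scored 1-10).
    failure_modes = [
        ("wrong dose selected", 9, 4, 6),
        ("wrong patient identified", 10, 2, 5),
    ]

    for name, sev, occ, det in failure_modes:
        rpn = sev * occ * det        # higher RPN = higher priority for mistake-proofing
        print(f"{name}: RPN = {rpn}")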

  26. Using failure analysis to design benign failures Use these methods for TWO purposes: • Traditional use: Determine what can happen • Carefully define the current situation, • Determine causes of undesirable failure, and • identify the “resources” required to generate that undesirable failure • New use: Determine ways of creating benign failures, and use them AS the preventive measures • provide insights into desired failures • Identify the “resources” required to generate them.

  27. Using failure analysis to design benign failures [Fault-tree figure: the harmful event occurs if Cause #1 AND Cause #2 occur together (P(C1C2) = .01) OR Cause #3 (P = .05) OR Cause #4 (P = .05) occurs, giving P(harmful event) = .11; the benign failure requires Cause #A AND Cause #B AND Cause #C together (P = .001) OR Cause #4 (P = .05), giving P(benign failure) = .051. Individual causes have probabilities of .1 and .05.] For more details see Grout, “Preventing Medical Errors by Designing Benign Failures.” Joint Commission Journal on Quality and Safety Vol. 29 (2003), No. 7, pp. 354-362.
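
The figure's probabilities can be reproduced with simple gate arithmetic; the gate structure below is inferred from the numbers on the slide and assumes independent causes.

    def and_gate(*probs):
        # All causes must occur: multiply probabilities.
        p = 1.0
        for x in probs:
            p *= x
        return p

    def or_gate(*probs):
        # Any cause suffices: 1 minus the chance that none occurs.
        q = 1.0
        for x in probs:
            q *= (1 - x)
        return 1 - q

    # Harmful event: (Cause #1 AND Cause #2) OR Cause #3 OR Cause #4
    p_harmful = or_gate(and_gate(0.1, 0.1), 0.05, 0.05)    # about .11

    # Benign failure: (Cause #A AND Cause #B AND Cause #C) OR Cause #4
    p_benign = or_gate(and_gate(0.1, 0.1, 0.1), 0.05)      # about .051

    print(round(p_harmful, 3), round(p_benign, 3))          # 0.107 0.051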

  28. Learning From Other Industries HEALTH CARE: Bumper Switch on Portable X-ray: The bumper is easily pushed in so that the slightest contact causes the portable unit to stop immediately. NOT HEALTH CARE: • Comparable solutions can be found on automatic guided vehicles (AGVs) in flexible manufacturing systems. The function of the bumper on the AGV is identical to the one on the x-ray machine.

  29. Learning From Other Industries HEALTH CARE • Drive Bar on Portable X-ray: The drive bar is the steering mechanism of the portable x-ray unit. This bar must be depressed in order for the unit to move. If the technologist takes their hands off this bar, the unit immediately stops.

  30. Learning From Other Industries NOT HEALTH CARE • Comparable solutions can be found at the Shanghai airport, or at home on a snow cone (shaved ice) machine.

  31. Learning from other industries Definitely NOT HEALTH CARE Would there be a use for this technology in healthcare? • Liquor control systems measure, count, control, and report the amount of liquid poured and compare it to point-of-sale receipts. • Comes in regular and wireless configurations that are capable of tracking 255 bottles.
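
The underlying logic is a simple reconciliation, sketched here with assumed data structures: compare what was measured as poured against what the point of sale says was sold, and flag any difference beyond a tolerance.

    def reconcile(poured_ml, sold_ml, tolerance_ml=30):
        # poured_ml and sold_ml map bottle IDs to millilitres.
        discrepancies = {}
        for bottle, poured in poured_ml.items():
            diff = poured - sold_ml.get(bottle, 0)
            if abs(diff) > tolerance_ml:
                discrepancies[bottle] = diff   # more poured than sold, or vice versa
        return discrepancies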

  32. Learning from other industries HEALTH CARE • Methadone, used to control narcotic addiction, is frequently encountered on the illicit drug market and has been associated with a number of overdose deaths. • The pictured device, manufactured by GW Pharmaceuticals and being tested at King’s College London, is tamper-proof, administers only the correct dosage, and can monitor compliance.

  33. Learning from other industries • Would there be a use for this technology in healthcare? • Pick-to-light bin system: • indicates correct bin with a light. • Infrared beam detects incorrect withdrawal, sounds alarm. • Capable of Ethernet downloads • Frame to retrofit existing shelving system
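
The control logic behind a pick-to-light bin is simple enough to sketch (sensor and alarm interfaces are assumed): light the correct bin, then sound the alarm the moment the beam reports a reach into any other bin.

    def check_pick(correct_bin, accessed_bin, sound_alarm):
        # The infrared beam reports which bin was actually reached into.
        if accessed_bin != correct_bin:
            sound_alarm()          # incorrect withdrawal detected immediately
            return False
        return True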

  34. Implementation Issues: medical vis-à-vis other industries • Liability/Discoverability/Need for anonymity? • Need careful assessment of down-side risk • Culture of depending on individuals not on systems • Process depends on individuals not on systems: Lack of consistent process • Resource shortages • Medical applications focus more on information counter-measures • Low barriers to diffusion • Need shared examples database

  35. Where do I begin? (listed from easier to harder) • Invite your departments/staff to collect existing examples of mistake-proofing in your facilities • Send the examples to me (jgrout@berry.edu) • Determine where additional mistake-proofing is needed • Look for examples in other industries that are applicable in health care • Invite suppliers to include/develop mistake-proof products • Invent and perform trials on new mistake-proofing devices

  36. Examples serve as catalog and catalyst • Catalog: the list of examples that you create will provide solutions which could be implemented elsewhere in your facility • Catalyst: you will develop an awareness of a new approach to patient safety problems • Gosbee & Anderson (VA) suggest such alternative examples improve root cause analysis • Increased “vocabulary” of responses to error

  37. Additional Information • Medical • Preventing Medical Errors by Designing Benign Failures. J. R. Grout, 2003, Joint Commission Journal on Quality and Safety 29(7): 354-362. • www.mistakeproofing.com/medical • Human factors engineering design demonstrations can enlighten your RCA team. J. Gosbee and T. Anderson, 2003, Quality and Safety in Health Care 12. • General • www.campbell.berry.edu/pokayoke • Make your service fail-safe. Chase, R. B., and D. M. Stewart. 1994. Sloan Management Review (Spring): 35-44. • Make No Mistake!: An Outcome-Based Approach to Mistake-Proofing. Hinckley, C.M. 2001. Portland, Oregon: Productivity Press. • Poka-yoke: Improving Product Quality by Preventing Defects. Nikkan Kogyo Shimbun/Factory Magazine (Ed.). 1988. Portland, Oregon: Productivity Press. • Zero Quality Control: Source Inspection and the Poka-yoke System. Shingo, Shigeo. 1986. Trans. A.P. Dillon. Portland, Oregon: Productivity Press. • The Design of Everyday Things. Norman, D.A. 1989. New York: Doubleday.

  38. Thank You
