
The Compelling Display of Data to Achieve Desired Decision Making

The Compelling Display of Data to Achieve Desired Decision Making. Robert Emery, DrPH, CHP, CIH, CSP, RBP, CHMM, CPP, ARM Assistant Vice President for Safety, Health, Environment & Risk Management The University of Texas Health Science Center at Houston


Presentation Transcript


  1. The Compelling Display of Data to Achieve Desired Decision Making Robert Emery, DrPH, CHP, CIH, CSP, RBP, CHMM, CPP, ARM Assistant Vice President for Safety, Health, Environment & Risk Management The University of Texas Health Science Center at Houston Associate Professor of Occupational Health The University of Texas School of Public Health

  2. Why Training on Data Presentation? • An interesting dilemma: • Almost all programs thrive on data • Virtually every important decision is based on data to some extent • Formal training in compelling data presentation is rare for many professionals • The ability to display data compellingly is the key to desired decision making

  3. Why Training on Data Presentation (cont.)? • The safety profession is particularly awash in bad examples of data presentations! • We’ve all endured them at some point in our careers! • Commentary: This may be the reason for repeated encounters with upper management who do not understand what their EH&S programs do.

  4. Evolution of EH&S Measures and Metrics • First step: • ultimate outcomes – OSHA 300 log, inspection non-compliance • Second step: • EH&S activities prior to first-order events – injuries and non-compliance

  5. Evolution of EH&S Measures and Metrics (cont.) • Third step: • Relating activities to larger institutional parameters – true metrics • Fourth step: • The compelling display of relationships so that the desired decision by upper management becomes obvious

  6. Achieving EH&S Data Display Excellence • The presentation of complex ideas and concepts in ways that are • Clear • Precise • Efficient • How do we go about achieving this?

  7. Go to the Experts on Information Display • Tukey, JW, Exploratory Data Analysis, Reading, MA, 1977 • Tukey, PA, Tukey, JW, Summarization: smoothing; supplemented views, in V. Barnett, ed., Interpreting Multivariate Data, Chichester, England, 1982 • Tufte, ER, The Visual Display of Quantitative Information, Cheshire, CT, 2001 • Tufte, ER, Envisioning Information, Cheshire, CT, 1990 • Williams, R, The Non-Designer's Design Book: Design and Typographic Principles for the Visual Novice, Berkeley, CA, 1994 • Tufte, ER, Visual Explanations, Cheshire, CT, 1997

  8. Recommendations • Don't blindly rely on the automatic graphic formatting provided by Excel or PowerPoint! • Encourage the eye to compare different pieces of data • Representations of numbers should be directly proportional to their numerical quantities • Use clear, detailed, and thorough labeling

  9. Recommendations (cont.) • Display variation in the data, not variation in design • Maximize the data-to-ink ratio – put most of the ink to work telling about the data! • When possible, use horizontal graphics: 50% wider than tall is usually best
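As a concrete illustration of these recommendations, here is a minimal sketch in Python with matplotlib (not part of the original slides; the monthly alarm counts are invented): the value axis starts at zero so the bars stay proportional to their quantities, non-data frame lines are stripped to raise the data-to-ink ratio, and the figure is 50% wider than tall.

```python
# A hypothetical example applying the slides' recommendations in matplotlib.
# The alarm counts below are invented for illustration.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
alarms = [42, 38, 35, 29, 24, 19]  # hypothetical nuisance-alarm counts

# Horizontal format: 50% wider than tall
fig, ax = plt.subplots(figsize=(9, 6))
ax.bar(months, alarms, color="gray")

# Proportional representation: force the value axis to start at zero
ax.set_ylim(bottom=0)

# Maximize the data-to-ink ratio: drop the non-data frame lines
ax.spines["top"].set_visible(False)
ax.spines["right"].set_visible(False)

# Clear, detailed, thorough labeling
ax.set_title("Nuisance Fire Alarms by Month (hypothetical data)")
ax.set_ylabel("Number of alarms")
```

The point is not this particular chart but the habit: override the charting tool's defaults deliberately rather than accepting them.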

  10. Compelling Remark by Tufte • “Visual reasoning occurs more effectively when relevant information is shown adjacent in the space within our eye-span” • “This is especially true for statistical data where the fundamental analytical act is to make comparisons” • The key point: “compared to what?”

  11. Four UTHSC-H "Make Over" Examples • Data we accumulated and displayed on: • Nuisance fire alarms • Workers' compensation experience modifiers • First reports of injury • Corridor clearance • But first, 2 quick notes: • The forum to be used: the "big screen" versus the "small screen"? In what setting are most important decisions made? • Like fashion, there are likely no right answers – individual tastes apply, but some universal rules will become apparent

  12.–20. Results of the Great UTHSC-H Nuisance Fire Alarm Challenge (a sequence of successive redesigns of the same chart; the slide 19 version is labeled Fiscal Year 04)

  21. Employee Workers' Comp Experience Modifier compared to other UT health components, FY 98–FY 04 (chart comparing UT-Tyler, UTMB, UT-SA, MDA, UT-H, and UT-SW; a rate of "1" is the industry average, representing $1 premium per $100)

  22. WCI Premium Adjustment for UTS Health Components (discount premium rating as compared to a baseline of 1), by fiscal year: UT Health Center Tyler (0.40), UT Medical Branch Galveston (0.38), UT HSC San Antonio (0.27), UT Southwestern Dallas (0.24), UT HSC Houston (0.17), UT MD Anderson Cancer Center (0.14)
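To make the stakes of such ratings concrete, a back-of-the-envelope sketch (not from the slides): the payroll figure is invented, and the 0.40 figure is read here as the modifier itself rather than as a discount subtracted from the baseline, which the slide's wording could also mean.

```python
# Hypothetical illustration of how an experience modifier scales a workers'
# comp premium. A modifier of 1.0 is the industry average: $1 of premium
# per $100 (the baseline on the previous slide). Payroll is invented.
payroll = 50_000_000      # hypothetical annual payroll, in dollars
base_rate = 1.00 / 100    # baseline: $1 of premium per $100

def annual_premium(modifier: float) -> float:
    """Premium after the experience modifier is applied to the base rate."""
    return payroll * base_rate * modifier

baseline = annual_premium(1.00)     # industry-average cost
discounted = annual_premium(0.40)   # e.g. the 0.40 rating listed above
savings = baseline - discounted     # dollars avoided through loss experience
```

Displayed well, that difference between the baseline and the discounted premium is exactly the kind of "compared to what?" figure upper management responds to.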

  23. Losses – Personnel: Reported Injuries by Population (chart series: Employee, Resident, Student; totals shown: 694, 715, 690, 675, 623, 608, 511)

  24. Number of First Reports of Injury, by Population Type: Total (n = 513), Employees (n = 284), Residents (n = 140), Students (n = 89)

  25. MSB Corridor Blockage in Cumulative Occluded Linear Feet, by Month and Floor (building floor indicated at origin of each line: 7th, 6th, 5th, 4th, 3rd, 2nd, 1st, G), 2004–2005

  26. Important Caveats • Although the techniques displayed here are powerful, there are some downsides to this approach • The time involved to assemble data and create non-standard graphs may not mesh with work demands • Relentless tinkering and artistic judgment are required • Suggested sources to observe regularly to develop an intuitive feel for the process: • A consistent source of good examples: Wall Street Journal • A consistent source of not-so-good examples: USA Today "chart-toons"

  27. Summary • The ability to display data compellingly is the key to desired decision making • Always anticipate "compared to what?" • Maximize the data-to-ink ratio – e.g., eliminate the unnecessary • Think about what it is you're trying to say • Show the display to others unfamiliar with the topic, without speaking – does it tell the story we're trying to tell?

  28. Your Questions at This Point? Now Let’s Look at Some Other Examples

  29. COLLABORATIVE LABORATORY INSPECTION PROGRAM (CLIP) • During October 2005, 80 Principal Investigators, covering a total of 316 laboratory rooms, were inspected • A total of 30 CLIP inspections were performed

  30. Comprehensive Laboratory Inspection Program (CLIP) Activities and Outcomes, 2005

  Month in 2005   Principal Investigators Inspected   Inspections Without Violations   Inspections With Violations
  May             94                                  53 (56%)                         41 (44%)
  June            78                                  40 (51%)                         38 (49%)
  July            84                                  54 (64%)                         30 (36%)
  August          74                                  54 (73%)                         20 (27%)
  September       69                                  39 (56%)                         30 (44%)
  October         80                                  50 (62%)                         30 (38%)
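The percentage columns on this slide follow directly from the raw counts; a short Python sketch (not from the original presentation) shows the arithmetic:

```python
# Derive slide 30's percentage columns from the raw counts.
# Values are (PIs inspected, inspections without violations).
clip = {
    "May":       (94, 53),
    "June":      (78, 40),
    "July":      (84, 54),
    "August":    (74, 54),
    "September": (69, 39),
    "October":   (80, 50),
}

rows = []
for month, (inspected, clean) in clip.items():
    with_violations = inspected - clean
    pct_clean = round(100 * clean / inspected)
    pct_violations = round(100 * with_violations / inspected)
    rows.append((month, inspected, pct_clean, pct_violations))
    # e.g. May -> ("May", 94, 56, 44)
```

The slide reports whole percents; note that splits near the midpoint of a percent (September is exactly 56.5% / 43.5%) can round differently depending on convention, so the slide's 56% / 44% pair does not sum from one consistent rounding rule.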

  31. 2005 Collaborative Laboratory Inspection Program (CLIP) Inspection Activities and Compliance Findings Number without violations Number with violations

  32. 2005 Collaborative Laboratory Inspection Program (CLIP) Inspection Activities and Compliance Findings Number without violations Number with violations

  33. Fig. 3. Receipts of Radioactive Materials Number of non-medical use radioactive material receipts Number of medical use radioactive material receipts

  34. Fig. 3. Receipts of Radioactive Materials Number of non-medical use radioactive material receipts Number of medical use radioactive material receipts

  35. Results of University EH&S Lab Inspection Program, 2003 to 2005 Number of labs existing but not inspected Number of labs inspected and one or more violation detected Number of labs inspected and no violations detected Note: 33 labs added to campus in 2005, increasing total from 269 to 302.

  36. Average Cost of Workers Compensation Claims, by Cause, for Period FY01–FY06 (causes shown: slips, trips, falls – inside; cumulative trauma; overextension, twisting; slips, trips, falls – outside; lifting/handling; uncontrolled object; each average computed from a total of 3 to 10 events)

  37. 2005 Total Number of Monthly Workers Compensation Claims, inclusive of the three most frequent identifiable classes of injuries (series: Total, Fall, Strain, Cut/Puncture)

  38. (chart: number caused by non-needle sharps versus number caused by hollow-bore needles, with the start of the academic year marked)

  39.

  Year   Fire Extinguisher Systems   Fire Extinguishers   Fire Related Incidents   Asbestos Projects
  1986   0                           0                    0                        0
  1996   203                         19                   91                       55
  1998   208                         25                   15                       68
  2003   437                         46                   -18                      191

  40. Growth in Occupational Safety Responsibilities 1986 to 2003

  41. Growth in Occupational Safety Responsibilities 1986 to 2003

  42. Figure 1: Laboratory Waste versus Total Waste Generated
