
Improving Safety Performance Through Measurement


Presentation Transcript


  1. Improving Safety Performance Through Measurement Mike Thomas, Managing Director; Prof. Andrew Hale, Director. IOSH London Health & Safety Group, 17 March 2008

  2. Overview • Key questions • What is the role of performance measurement in a Safety Management System (SMS)? (“What gets measured gets done”, Drucker and HSE in HSG65) • How can measurement be used to improve performance and drive continual improvement? • How can we select appropriate key performance indicators (KPIs)? • How can we develop meaningful targets? • How can we promote performance measurement by line managers and link this to audits?

  3. A Success Story: Corus NL The maintenance department of an integrated steel works, employing about 1,100 people, used a KPI-driven intervention to improve performance. • Improvement driven by KPIs for managers: • Compulsory + free choice, ‘proactive’ + ‘reactive’ • Targets per KPI + score (1-10, with 6 for hitting the target) • KPIs given different weights & totalled for an overall score per period • Reported on a ‘dashboard’ every 6 weeks (see the scoring sketch after slide 4) • STOP-GO cards for the workforce – dynamic task risk assessment

  4. Key Performance Indicators Output: LTI frequency + Recordable frequency Compulsory: • Reporting of dangerous situations (0.5 per person per year) • Dealing with dangerous situation reports (80%) • Toolbox meetings (including safety) (80%) Choice (at least 1 of 4): • Behaviour observation rounds (80%) • Safety communication rounds (80%) • ‘5 S’ housekeeping inspections (80%) • Risk assessment + plan of action (80%)
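A minimal sketch of the scoring mechanics described on slides 3-4, in Python. Each KPI has a target and a weight, hitting the target scores 6 on the 1-10 scale, and weighted scores are totalled for the six-weekly dashboard. The slides state only the targets, the 1-10 scale with 6 at target, and that weights differ; the linear scaling around the target and the specific weights below are assumptions.

    # Sketch of a weighted KPI dashboard in the style of slides 3-4.
    # Targets come from slide 4; the weights and the linear scaling
    # around the target are illustrative assumptions.

    KPIS = {
        # name: (target, weight)
        "dangerous situation reports (per person per year)": (0.5, 2.0),
        "dangerous situation reports dealt with (%)": (80.0, 1.5),
        "toolbox meetings held (%)": (80.0, 1.0),
        "behaviour observation rounds (%)": (80.0, 1.0),
    }

    def kpi_score(actual: float, target: float) -> float:
        """Map performance to the 1-10 scale, with 6 for hitting the target."""
        score = 6.0 * actual / target      # assumed proportional scaling
        return max(1.0, min(10.0, score))  # clamp to the 1-10 scale

    def dashboard(actuals: dict) -> float:
        """Weighted total for one six-weekly reporting period."""
        return sum(kpi_score(actuals[name], target) * weight
                   for name, (target, weight) in KPIS.items())

    print(dashboard({
        "dangerous situation reports (per person per year)": 0.6,
        "dangerous situation reports dealt with (%)": 75.0,
        "toolbox meetings held (%)": 85.0,
        "behaviour observation rounds (%)": 80.0,
    }))  # about 35.2 with these weights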

  5. Continual improvement The OHSAS 18001 cycle: OH&S Policy → Planning → Implementation and operation → Checking and corrective action → Management review → continual improvement

  6. Measurement • Within 18001 • “Checking and corrective action” • Includes • Measurement of proactive data • Measurement of reactive data • Internal audit (independent) • Reactive and proactive are key issues

  7. [Diagram: the risk management system. Direct risk controls (hard-, soft- & liveware) act on the primary processes. Known risk is monitored through risk assessment & control and inspection; unknown risk surfaces through incident analysis. Audit, evaluation and comparison feed review & improvement of the RIE system.]

  8. Types of indicator • Damage, injury, loss • Precursors – for each scenario, different precursors (incidents, leaks, breakdowns)

  9. Accident Deviation Model [diagram]: Normal situation with in-built hazards → Deviations from the normal situation → Loss of control (release of energy + exposure) → Transmission → Damage process. For each hazard there are different deviation scenarios, with different control measures, failing in different ways. Barriers at successive stages: elimination of hazard / re-design; hazard control measures; detection & recovery; escape; secondary safety; rescue, damage limitation, treatment; stabilisation. Reporting and learning feed back into the choice and design of the (sub)system and of the prevention & control measures.

  10. LTIs don’t predict disasters • The lesson of Texas City: • BP became distracted from efforts to control process safety because it was lulled into a false sense of security by its excellent control of lost time injuries. • The fallacy of Bird’s interpretation of Heinrich’s triangle

  11. Heinrich’s triangle: 1 disabling injury : 29 minor injuries : 300 no-injury incidents. Heinrich’s thesis: • Reasoning from the top down • Each scenario or accident type taken separately • Average ratio over scenarios

  12. Bird’s triangle (1966): 1 disabling injury : 100 minor injuries : 500 property-damage incidents. From Bird onwards: • Overall statistics of a department, factory, or country • Different categories of severity
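The ratio arithmetic these triangles imply can be made explicit, and doing so shows what slide 10 calls a fallacy. The Python sketch below scales a whole triangle from one observed category; the inference is only valid if every accident scenario shared the same average ratio, which slides 10-13 argue it does not.

    # Top-down triangle arithmetic (slides 11-12), shown to make the
    # averaged-over-scenarios inference explicit — not to endorse it.

    HEINRICH = {"disabling": 1, "minor": 29, "no injury": 300}
    BIRD_1966 = {"disabling": 1, "minor": 100, "property damage": 500}

    def implied_counts(triangle: dict, base: str, observed: float) -> dict:
        """Scale the whole triangle from one observed category."""
        factor = observed / triangle[base]
        return {category: count * factor for category, count in triangle.items()}

    # 600 no-injury incidents would "imply" 2 disabling and 58 minor
    # injuries — but only if every scenario followed the average ratio.
    print(implied_counts(HEINRICH, "no injury", 600))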

  13. Types of indicator • Damage, injury, loss • Precursors – for each scenario, different precursors (incidents, leaks, breakdowns) • Presence & readiness of risk control measures (hardware, software, people’s behaviour) • Management system processes to deliver the preventive measures

  14. Resources and controls in the SMS. Each resource for safe operation is delivered by a management process, and these processes are the ones to be audited: • Competence of staff ← selection & training • Availability of manpower ← manpower planning • Commitment to safe operations & conflict resolution ← incentives, supervision, appraisal, culture, management priority • Communication within and between teams ← handover, briefing, communication channels • Procedures, goals & rules ← task/policy analysis & design • Hardware & software ← design, layout, maintenance

  15. Problems with Auditing • Seen as a panacea • “Bolted on” – imposed from outside • May not match policy & own management system: independent of management • Passive process for manager • “Economical with the truth” • Wait for next audit • No ownership

  16. Types of indicator • Damage, injury, loss • Precursors – for each scenario, different precursors (incidents, leaks, breakdowns) • Presence & condition of risk control measures (hardware, software, people’s behaviour) • Management system processes to deliver the preventive measures • Attitudes and values (culture) to use the SMS processes and the risk control measures

  17. Safety climate/culture measurement • Existing climate questionnaires: • limited validation (TRIPOD is one exception) • Interpretation is still more an art than a science – lack of clarity over underlying models of culture • Gap between measuring (profile) and improving • Safety culture maturity scales (Hearts & Minds): • Similar criticisms • Individual completion followed by group discussion generates much useful debate

  18. Improving Performance • Once you are clear on meaning & objectives • Look at measurement and improvement • Three key issues • How to define performance in measurable terms? • How can performance be measured? • How to extend to target setting and to drive improved performance?

  19. How is Improvement Achieved? • “What gets measured gets done” • Apply sound management techniques to safety • Performance standards - what people must do • Who is responsible? – whose KPI? • What are they responsible for? – tasks, processes • When should the work be done? - plans • What is the expected result? – (intermediate) outputs • Set targets and measure performance in terms meaningful to each individual
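A performance standard in slide 19's terms (who, what, when, expected result) can be captured as a simple record. The Python sketch below is my construction, not something from the presentation; the field names and example values are illustrative.

    # Hypothetical record of a performance standard in slide 19's terms.
    from dataclasses import dataclass

    @dataclass
    class PerformanceStandard:
        owner: str            # who is responsible (whose KPI?)
        task: str             # what they are responsible for (task/process)
        schedule: str         # when the work should be done (the plan)
        expected_result: str  # the (intermediate) output
        target: float         # the measurable level, e.g. 0.8 for 80%

    standard = PerformanceStandard(
        owner="Area maintenance manager",
        task="Hold toolbox meetings with a safety item",
        schedule="Every six-week reporting period",
        expected_result="Meetings held and minuted as planned",
        target=0.8,
    )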

  20. Criteria for performance measures • Validity • Reliability • Representativeness • Sensitivity • Openness to bias • Cost-effectiveness NB. Exposure measures

  21. Typical targets in the past • “eliminate lost time (or all) accidents” • “reduce all accidents to n or by x% per annum” • “reduce number of days lost to n or by x%” • “reduce cost of claims or other losses by y%” • “eliminate notices as a result of enforcement action” • Obvious approach = what we want! Commendable, but typically reactive

  22. Accidents as a performance measure, rated against the criteria of slide 20: • Validity: very high • Reliability: high, but absence → serious • Representativeness: good if enough accidents • Sensitivity: low for a good organisation • Reporting bias: strong for minor injuries • Cost-effectiveness: high cost of analysis • Exposure measure: per man or per man-hour
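The exposure-measure row deserves a worked example: raw accident counts only become comparable when divided by hours worked or headcount. The Python sketch below uses a million-hour multiplier for an LTI frequency rate; conventions differ (100,000 and 200,000 hours are also in use), so the constant is an assumption, not the presentation's.

    # Exposure-normalised accident rate ("/man or /manhour", slide 22).
    # The 1,000,000-hour multiplier is one common LTI frequency
    # convention; other schemes use 100,000 or 200,000 hours.

    def lti_frequency_rate(lost_time_injuries: int, hours_worked: float,
                           per_hours: float = 1_000_000) -> float:
        return lost_time_injuries * per_hours / hours_worked

    # e.g. 3 LTIs in a 1,100-person department working about 1,700
    # hours each per year:
    print(lti_frequency_rate(3, 1100 * 1700))  # ~1.6 per million hours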

  23. Problems with Reactive Targets • Problems - well recognised? • Not helpful to managers! • They accept the value and sense but … • What must they do? • “Promotes” under-reporting and “manipulation” of outcome • What if zero accidents already? • Past performance may be a poor predictor of the future

  24. Accident Rates • "A low accident or incident rate over a period of years is no guarantee that risks are being effectively controlled. This is particularly so in organisations involved in major hazard activities, where the probability of an accident may be low but where the consequences could be extremely serious. In this type of organisation, the historical incidence of reported accidents alone can be an unreliable indicator of safety or environmental performance and can lead to complacency." • Guidance on COMAH Regulations, UK

  25. Reactive Targets • Why still set? • Easy option that sounds good! • Mirrors UK Government’s targets • Accident reduction IS the desired output of the SMS • Are Reactive “Targets” Still Valid? • At a high level within the organisation as “aim” or “vision” • But NOT as targets for managers • Need to move to better data: step 1 • Minor injuries and health effects • Near misses (surprises) or dangerous situations

  26. Way Forward? • Aim • Reduce accidents but numbers of accidents are NOT used as exclusive targets for managers • Move to proactive measures • Set targets on proactive measures of performance • Compare with quality management

  27. Contrast: Safety and Quality • ISO approach to quality as: • An essential feature • Not an optional extra • Emphasise: • managing quality in, not inspecting defects out • get the management processes right, • but still measure defects • Apply this philosophy to safety

  28. What is Proactive Measurement? • Traditional measures (input measures): e.g. • PPE being used • Guards in place • Documents reviewed and updated • All employees trained • Toolbox meetings held • Maintenance conducted to plan • More dynamic measures (input + intermediate) • Hazards identified and put right • Competence tested & used • Behaviour observed, discussed & improved

  29. Planning and Target Setting • If you accept the philosophy, how can measures be translated into targets? • BS 8800 approach • Annex C (BS 8800:1996) • Simplified version (BS 8800:2004, Annex D) • Planning and Implementing

  30. Planning for Safety [flowchart]: Aim → Objective → Define outcome indicator → Measure baseline data → Devise plan (management programme) → Topics with questions as targets → IMPLEMENT PLAN

  31. Measuring Success [flowchart]: IMPLEMENT PLAN → measure outcome data and measure compliance with programme (targets) → Achieving objective? Meeting targets? → if yes, continue process; if no, take corrective action as necessary
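Slides 30-31 amount to a closed control loop: implement the plan, measure both outcomes and compliance, then branch. Here is a minimal Python sketch of that review logic; the branch for "targets met but objective not achieved" is my reading of the flowchart, not its literal content.

    # Sketch of the BS 8800-style review loop from slides 30-31.
    # The branch detail is an interpretive assumption.

    def review_cycle(objective_achieved: bool, targets_met: bool) -> str:
        if objective_achieved and targets_met:
            return "Continue process"
        if targets_met:
            # Doing what was planned but not getting the outcome:
            # the plan or the indicators need revisiting.
            return "Corrective action: revise the plan/indicators"
        # Targets missed: enforce or resource the existing programme.
        return "Corrective action: address non-compliance with programme"

    print(review_cycle(objective_achieved=False, targets_met=True))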

  32. Combining Data

  33. Why pose questions in the plan? • Forces clear, logical thought • Defines the programme • Defines the targets • Questions are a blueprint for the managers • Questions can be used to measure success • “Correct” answer is “yes” • Link to audit (e.g. CHASE)

  34. Proactive Monitoring and Audit • HASTAM’s CHASE system • Questions developed by planning process • Used by managers to measure performance • Used by specialists for audit • Audit by verification of • Managers’ answers • Recommended remedial action • Reinforces ownership by managers
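Slides 33-34 describe monitoring through questions whose "correct" answer is "yes", self-scored by managers and later verified by audit. CHASE is HASTAM's proprietary system, so the Python sketch below is a generic, hypothetical illustration of question-based self-monitoring rather than CHASE's actual data model.

    # Hypothetical question-based self-monitoring in the spirit of
    # slides 33-34; not CHASE's real structure.

    questions = [
        ("Were all toolbox meetings held this period?", True),
        ("Were reported dangerous situations dealt with?", True),
        ("Was the behaviour observation round completed?", False),
    ]

    # Compliance = fraction of "yes" answers; managers own the score.
    compliance = sum(answer for _, answer in questions) / len(questions)
    print(f"Manager self-assessment: {compliance:.0%} 'yes' answers")
    # An independent auditor then verifies a sample of the answers and
    # the recommended remedial actions, reinforcing ownership by the
    # managers rather than replacing it.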

  35. Conclusion • Explained • Need for and effective means of setting realistic targets • Targets NOT based on “reactive” measures • Based on proactive measures of system • However, no conflict with reactive aims • Targets provide • Clear implementation mechanism for managers • Effective measurement of improved performance

  36. Contact • Andrew Hale, Hastam Ltd, The Old Bakehouse, Maldon, Essex CM9 4LE • 01621 851756 • andrew.hale@hastam.co.uk
