
HCI & Safety Critical Systems


Presentation Transcript


  1. HCI & Safety Critical Systems Lynne Hall

  2. Overview • What are safety critical systems • Why use software • Causation • The fallacy of human error • Designing a good operator interface • Example: Night Order Book

  3. Introduction • Incorporation of computers into potentially dangerous systems • Use of computers for control functions • Computers now control most safety critical devices • Often replace traditional hardware safety interlocks and protection systems
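
To make the last point concrete, here is a minimal sketch in C of a software pressure interlock of the kind that replaces a hardwired protection switch. The I/O functions and the 850 kPa limit are invented for illustration; a real system would reach plant hardware through certified drivers.

    #include <stdio.h>

    #define MAX_SAFE_PRESSURE_KPA 850.0

    /* Simulated plant I/O -- stand-ins for real driver calls. */
    static double read_pressure_kpa(void) { return 900.0; /* simulate over-pressure */ }
    static void close_feed_valve(void)    { puts("feed valve closed"); }
    static void raise_alarm(const char *msg) { printf("ALARM: %s\n", msg); }

    /* Software interlock, run on every scan cycle of the control loop.
     * A hardwired interlock trips mechanically; this protection exists
     * only for as long as this code runs and is correct. */
    static void pressure_interlock_scan(void)
    {
        if (read_pressure_kpa() > MAX_SAFE_PRESSURE_KPA) {
            close_feed_valve();                     /* protective action */
            raise_alarm("vessel over-pressure, feed shut off");
        }
    }

    int main(void)
    {
        pressure_interlock_scan();
        return 0;
    }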

  4. Safety Critical Systems • Process Supervision and Control • power stations • electricity networks • chemical sector • Health • life support systems • Transport • Aviation / Space • Ground Transport

  5. Tornado F3 cockpit Taken from: http://www.ptvideo.com/videos/Aviation/cockpit.html

  6. Telerobotic System Taken from: http://www.cse.dmu.ac.uk/~arg/tmmi/interface.html

  7. Defence Sector Taken from: http://www.army-technology.com/contractors/computers/orbit/index.html

  8. Control Rooms (ATC) Taken from: http://www.wild-designs.demon.co.uk/ccd.htm

  9. Industrial Processes • Inherently risky • Risk compounded by: • practicalities of plant maintenance • need for incremental improvements to technology infrastructure • Economic loss through downtime • Failure can result in injury or death

  10. Characteristics • Exceptionally complex • Hundreds of thousands of lines of code • multiple pathways • Embedded systems • hidden from the user • Opaque • High potential for information overload • Ambiguous role of the operator

  11. Some scary facts • One error in every 50 lines of code • safety-critical systems run to 100,000+ lines • Ariane 5 - destroyed by an unchecked 64-bit-to-16-bit numeric conversion (sketched below) • Impossible to fully test the integrity of safety-critical systems until they are put into the real world • Impact of failure can be catastrophic: 200,000 people injured at Bhopal
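
The Ariane 5 failure shows how small the fatal error can be: a 64-bit floating-point value related to horizontal velocity was converted to a 16-bit signed integer without a range check, and the resulting overflow exception shut down the inertial reference system. The flight code was Ada; the sketch below renders the failure class, and the guard that prevents it, in C.

    #include <stdio.h>
    #include <stdint.h>

    /* The Ariane 5-class bug: narrowing a 64-bit float to a 16-bit
     * integer with no range check. In C the conversion is undefined
     * for out-of-range values; in the Ada flight code it raised an
     * unhandled exception that shut down the guidance computer. */
    int16_t to_i16_unchecked(double v)
    {
        return (int16_t)v;      /* in range on Ariane 4, fatal on Ariane 5 */
    }

    /* Defensive version: saturate (or reject) out-of-range values. */
    int16_t to_i16_checked(double v)
    {
        if (v > INT16_MAX) return INT16_MAX;
        if (v < INT16_MIN) return INT16_MIN;
        return (int16_t)v;
    }

    int main(void)
    {
        double horizontal_velocity = 40000.0;   /* exceeds int16_t range */
        printf("checked conversion: %d\n", to_i16_checked(horizontal_velocity));
        return 0;
    }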

  12. Examples • Ariane 5 • Chernobyl • Challenger • Union Carbide chemical plant (Bhopal) • Three Mile Island • Big One Rollercoaster (Blackpool) • Channel Tunnel Fire • Texaco Oil Refinery

  13. Why not to use software • Automation can result in tedium • De-skilling • Slower operator reaction times • Possible execution paths in software are so extensive that they cannot be exhaustively tested (see the sketch below)
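
The final bullet is simple arithmetic: every independent two-way branch doubles the number of execution paths, giving 2^n paths for n branches. The sketch below prints how fast that grows; a routine with only 40 independent branches already has roughly 10^12 paths.

    #include <stdio.h>

    /* Each independent if/else doubles the number of execution paths:
     * n two-way branches give 2^n paths. Exhaustive path testing is
     * hopeless long before a system reaches 100,000 lines. */
    int main(void)
    {
        double paths = 1.0;
        for (int branches = 1; branches <= 60; branches++) {
            paths *= 2.0;
            if (branches % 10 == 0)
                printf("%2d branches -> %.3g paths\n", branches, paths);
        }
        return 0;
    }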

  14. Why use software • Automation of safety-critical processes • Continual monitoring of the process • Guidance for the user in a safety-critical process • Provision of advance warning • The growing complexity of new systems requires the use of software

  15. The Scapegoat - Human Error • 75% of aviation accidents attributed to a mistake by the flight crew • Inadequate design can place the operator in a situation where error is inevitable, or at least very likely • The contribution operators can make to the design of safety-critical systems may be undervalued and underused

  16. Why do errors happen? • Multi-level model • Failings in the social context • management and safety culture • training and awareness • Cognitive-level errors in human decision making • training • task design • Design errors at the interface • not the user’s fault

  17. “Windows of Opportunity” for Human Error • Failure of human responsibilities • Effect of unexpected hardware/software failure • Dealing with rare events • Level of user knowledge • Cognitive workload • Utility and usability

  18. Design of a good operator interface • System design requires an understanding of the strengths and weaknesses humans display under operational conditions • “Soft” factors can be very important • LIFETRACK project • information that underpins communication • communication structures • stakeholders • training (and not just in-house)

  19. Designing the Operator Interface • Not a last-minute task • Not just concerned with superficial factors such as layout and displays • Reaches deep into the requirements and design processes • Concerned with what should be automated, how it should be automated, and whether it should be automated at all • Social, psychological and technical issues

  20. IEC 61508 • Functional safety of electrical / electronic / programmable electronic safety-related systems • Recognises the need for human factors • A standard, though not a very explicit one • Integrates human factors into the development process

  21. Night Order Book • Context: Chemical Plant • Produced daily by technical supervisor • Multiple paper copies distributed to night shift • Allows day shift to inform night shift of important process facts and developments

  22. Why move to a computer-based system • Delivery delays • Data loss and confusion • Clutter • Data access limitations • Little or no access to past knowledge

  23. Operator Requirements • Fast • Uncluttered, consistent, “known” interface style • Important information readily available in an at-a-glance format • Large buttons • Avoidance of pull-down menus

  24. Operator Requirements 2 • Avoidance of excessive typing • Use of keyboard rather than mouse • A few basic queries should support all requests • Information access should require a minimum number of actions • Authorised input only (see the sketch below) • Data security
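
As an illustration of the last two requirements, here is a hypothetical sketch in C of authorised-only input to a computer-based night order book. The user IDs, function names (is_authorised_supervisor, append_order) and console “storage” are invented for this example, not taken from the actual plant system.

    #include <stdio.h>
    #include <string.h>
    #include <stdbool.h>

    /* Hypothetical rule: only named technical supervisors may write
     * entries; all operators may read. IDs and storage are simulated. */
    static const char *AUTHORISED_WRITERS[] = { "tsup01", "tsup02" };

    static bool is_authorised_supervisor(const char *user_id)
    {
        size_t n = sizeof AUTHORISED_WRITERS / sizeof *AUTHORISED_WRITERS;
        for (size_t i = 0; i < n; i++)
            if (strcmp(user_id, AUTHORISED_WRITERS[i]) == 0)
                return true;
        return false;
    }

    /* Append a night order only for an authorised writer. */
    static bool append_order(const char *user_id, const char *text)
    {
        if (!is_authorised_supervisor(user_id)) {
            fprintf(stderr, "rejected: %s is not authorised to write\n", user_id);
            return false;
        }
        printf("night order recorded by %s: %s\n", user_id, text);
        return true;
    }

    int main(void)
    {
        append_order("tsup01", "Reactor 2 feed pump on manual overnight");
        append_order("op17",   "attempted unauthorised entry");
        return 0;
    }

A design note on the sketch: unauthorised writes are rejected and reported rather than silently dropped, since a missing or spoofed night order is itself a hazard the next shift needs to know about.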

  25. Summary • Safety-critical systems rely on computing hardware and software • Human factors must be included throughout the lifecycle of safety-critical systems • HCI for safety-critical systems is essential for appropriate work support • The display and layout of the interface must be rigorously tested and evaluated
