HCI & Safety Critical Systems Lynne Hall
Overview • What are safety critical systems • Why use software • Causation • The fallacy of human error • Designing a good operator interface • Example: Night Order Book
Introduction • Incorporation of computers into potentially dangerous systems • Use of computers for control functions • Computers now control most safety critical devices • Often replace traditional hardware safety interlocks and protection systems
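Below is a minimal sketch (my own illustration, not from the lecture) of what such a software protection function can look like: a hypothetical pressure trip with an assumed warning level below the trip level, standing in for the hardware relay it replaces. The thresholds and names are invented.

```python
# Minimal sketch of a software protection function of the kind that now
# replaces a hardware safety interlock. Thresholds and names are
# hypothetical, chosen only to illustrate the idea.

WARN_BAR = 9.0   # assumed warning threshold (bar)
TRIP_BAR = 10.0  # assumed trip threshold (bar)

def check_pressure(reading_bar: float) -> str:
    """Return the protective action for a single sensor reading."""
    if reading_bar >= TRIP_BAR:
        return "TRIP"   # shut the process down, as a hardware relay once did
    if reading_bar >= WARN_BAR:
        return "WARN"   # give the operator advance warning
    return "OK"

assert check_pressure(8.5) == "OK"
assert check_pressure(9.5) == "WARN"
assert check_pressure(10.2) == "TRIP"
```

Unlike a hardwired relay, every path through code like this must be verified, which is exactly the testing problem raised later in the lecture.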
Safety Critical Systems • Process Supervision and Control • power stations • electricity networks • chemical sector • Health • life support systems • Transport • Aviation / Space • Ground Transport
Tornado F3 cockpit Taken from: http://www.ptvideo.com/videos/Aviation/cockpit.html
Telerobotic System Taken from: http://www.cse.dmu.ac.uk/~arg/tmmi/interface.html
Defence Sector Taken from: http://www.army-technology.com/contractors/computers/orbit/index.html
Control Rooms (ATC) Taken from: http://www.wild-designs.demon.co.uk/ccd.htm
Industrial Processes • Inherently risky • Risk compounded by: • practicalities of plant maintenance • need for incremental improvements to technology infrastructure • Economic loss through downtime • Failure can result in injury or death
Characteristics • Exceptionally complex • Hundreds of thousands of lines of code • multiple pathways • Embedded systems • hidden from the user • Opaque • High potential for information overload • Ambiguous position of the operator
Some scary facts • Roughly one error in every 50 lines of code • safety-critical systems run to 100,000+ lines • Ariane 5: destroyed by a single unhandled conversion error in reused code (sketched below) • Impossible to fully test the integrity of safety-critical systems until they are put into the real world • Impact of failure can be catastrophic: 200,000 people injured at Bhopal
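The Ariane 5 failure is worth a sketch. The flight software was written in Ada; the Python below is only an illustration of the failure mode, with invented function names: a 64-bit floating-point value converted to a 16-bit signed integer without an effective range check, so a large but physically valid input raised an unhandled exception.

```python
# Illustration (in Python; the real code was Ada) of the Ariane 5
# failure mode: converting a 64-bit float to a 16-bit integer with no
# handled range check.

INT16_MIN, INT16_MAX = -32_768, 32_767

def to_int16(x: float) -> int:
    value = int(x)
    if not INT16_MIN <= value <= INT16_MAX:
        # In flight, this exception went unhandled and shut down the
        # inertial reference system.
        raise OverflowError(f"{x} does not fit in 16 bits")
    return value

def to_int16_saturating(x: float) -> int:
    """A defensive alternative: clamp out-of-range values instead."""
    return max(INT16_MIN, min(INT16_MAX, int(x)))

to_int16(1_000.0)          # fine on Ariane 4-style trajectories
to_int16_saturating(1e6)   # clamps to 32_767 rather than raising
```

One handled exception, or a documented decision to clamp, would have changed the outcome: the defect was Ariane 4 code reused under Ariane 5 flight dynamics.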
Examples • Ariane 5 • Chernobyl • Challenger • Union Carbide chemical plant (Bhopal) • Three Mile Island • Big One Rollercoaster (Blackpool) • Channel Tunnel Fire • Texaco Oil Refinery
Why not to use software • Automation can result in tedium • De-skilling of operators • Slower operator reaction times • Possible paths through software so extensive that they cannot be exhaustively tested
Why use software • Automates safety-critical processes • Continual monitoring of the process • Gives guidance to the user in a safety-critical process • Provides advance warning • Growing complexity of new systems requires the use of software
The Scapegoat - Human Error • Around 75% of aviation accidents are attributed to flight crew error • Inadequate design can place the operator in a situation where error is inevitable, or at least very likely • The contribution operators can make to the design of safety-critical systems is often undervalued and underused
Why do errors happen? • Multi-level model • Failings in the social context • management and safety culture • training and awareness • Cognitive-level errors in human decision making • training • task design • Design errors at the interface • not the user’s fault
“Windows of Opportunity” for Human Error • Failure of human responsibilities • Effects of unexpected hardware/software failure • Dealing with rare events • Level of user knowledge • Cognitive workload • Utility and usability
Design of a good operator interface • System design requires an understanding of the strengths and weaknesses humans display under operational conditions • “Soft” facts can be very important • LIFETRACK project • information that underpins communication • communication structures • stakeholders • training (and not just in-house)
Designing the Operator Interface • Not a last-minute task • Not just concerned with superficial factors such as layout and displays • Reaches deep into the requirements and design processes • Concerned with what should be automated, how it should be automated (and whether it should be automated at all) • Social, psychological and technical issues
IEC 61508 • Functional safety of electrical / electronic / programmable electronic safety-related systems • Recognises the need for human factors • Not very explicit about how • Integrates human factors into the development process
Night Order Book • Context: Chemical Plant • Produced daily by technical supervisor • Multiple paper copies distributed to night shift • Allows day shift to inform night shift of important process facts and developments
Why move to computer-based? • Delivery delays • Data loss and confusion • Clutter • Data access limitations • No or limited access to past knowledge
Operator Requirements • Fast • Uncluttered, consistent, “known” interface style • Important information readily available in an at-a-glance format • Large buttons • Avoidance of pull-down menus
Operator Requirements 2 • Avoidance of excessive typing • Use of keyboard rather than mouse • A few basic queries should support all requests • Information access achieved with a minimum number of actions • Authorised input only • Data security (see the sketch below)
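A minimal sketch, with invented class and role names, of how these requirements might shape a computer-based night order book: authorised input only, and a couple of basic queries covering most requests (today's orders at a glance, plus access to past knowledge).

```python
from dataclasses import dataclass
from datetime import date

AUTHORISED_AUTHORS = {"technical_supervisor"}  # assumed authorisation list

@dataclass
class NightOrder:
    day: date
    author: str
    text: str

class NightOrderBook:
    """Sketch of a computer-based night order book (names invented)."""

    def __init__(self) -> None:
        self._orders: list[NightOrder] = []

    def add(self, order: NightOrder) -> None:
        # Requirement: authorised input only.
        if order.author not in AUTHORISED_AUTHORS:
            raise PermissionError("authorised input only")
        self._orders.append(order)

    # Requirement: a few basic queries should support all requests.
    def for_day(self, day: date) -> list[NightOrder]:
        """Today's orders, for an at-a-glance display."""
        return [o for o in self._orders if o.day == day]

    def search(self, keyword: str) -> list[NightOrder]:
        """Access to past knowledge the paper book could not give."""
        return [o for o in self._orders if keyword in o.text]
```

Each query is a single action, in line with the requirement that information access take a minimum number of actions.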
Summary • Safety-critical systems rely on computing hardware and software • Human factors must be included throughout the lifecycle of safety-critical systems • HCI for safety-critical systems is essential for appropriate work support • The display and layout of the interface must be rigorously tested and evaluated