
Introduction to Human Computer Interaction



1. Process Control
• Control of large-scale, complex processes in which humans and/or machines process some commodity.
• For example, heating a room is a process with a specific, desired outcome: reach a defined temperature and then keep it constant over time (see the control-loop sketch after this slide).
• Primary tasks of the human operator:
  • Receive information concerning the process under control
  • Make decisions concerning the states of the parameters under control
  • Communicate parameter requirements through hardware to the system process
  • Detect when specific parameters are beyond tolerable limits, and diagnose/correct the malfunction in the system responsible for the problem
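
The room-heating example maps directly onto a closed control loop, and the operator's four tasks appear as steps of that loop. Below is a minimal sketch under assumed conditions: the toy thermal model, the hysteresis band, and names such as `simulate_room` are all illustrative, not from the slides.

```python
# Minimal sketch of the room-heating process: an on/off controller that
# reaches and then holds a target temperature. The "plant" dynamics and
# all names here are illustrative assumptions.

def simulate_room(setpoint_c=21.0, start_c=15.0, steps=60):
    temp = start_c
    heater_on = False
    for step in range(steps):
        # Operator task 1: receive information about the process (read the sensor)
        reading = temp
        # Task 2: decide on the controlled parameter (simple hysteresis band)
        if reading < setpoint_c - 0.5:
            heater_on = True
        elif reading > setpoint_c + 0.5:
            heater_on = False
        # Task 3: communicate the decision to the process (actuate the heater)
        temp += 0.4 if heater_on else -0.2   # toy thermal dynamics
        # Task 4: detect when the parameter is beyond tolerable limits
        if abs(reading - setpoint_c) > 5.0 and step > 20:
            print(f"step {step}: temperature {reading:.1f} C out of tolerance")
    return temp

if __name__ == "__main__":
    print(f"final temperature: {simulate_room():.1f} C")
```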

2. The SHEL Model
• S: Software – the rules, procedures, and programs describing how the system should work
• H: Hardware – the fabricated components of which the system is built
• E: Environment – the physical, economic, and social setting in which the system functions
• L: Liveware – the human beings who contribute to the operation of the system
• Interfaces:
  • L–H interface: the human/machine interface
  • L–S interface: the fit between humans and the operating procedures of the system
  • L–E interface: the interaction between environmental and organizational aspects of the system
  • L–L interface: the interaction between crews, small groups, and teams

3. Supervisory Control
• The role of the operator is monitoring (a result of increased automation)
• The system requires only occasional fine tuning of system parameters to perform adequately
• The operator must alter the input or the control routines, and must serve as a back-up in the event of a system failure
• Intervention in the system's operation occurs infrequently and at unpredictable times
• The duration of an intervention is generally very short (minutes or seconds)
• The values and costs associated with the operator's decisions and actions can be very great
• Good performance requires rapid assimilation of large amounts of information and involves complex interfaces

4. Sheridan's Model of Supervisory Control – Planning Mode
[Block diagram: Display, Computer, Process, Controls]
• Self-paced and "off-line"
• Anticipation of a response to a future event
• What sensors to use, and how to process/display the information from those sensors
• What alternative response sequences will produce the desired changes in the process
• What the relative worth of these changes will be, given the current state of the process
• Teaching the user

5. Sheridan's Model of Supervisory Control – Monitoring Mode
[Block diagram: Display, Computer, Process, Controls]
• Allocation of attention by the operator among the different displays
• Determine if all the procedures are functioning properly
• If a problem exists in the loop, the operator must diagnose the problem

6. Sheridan's Model of Supervisory Control – Intervening Mode
[Block diagram: Display, Computer, Process, Controls]
• The operator interrupts the "monitoring" loop
• The operator begins direct control of the process
• Occurs under emergency conditions
• Also occurs during routine maintenance or repair (see the sketch after this slide)
http://www.youtube.com/watch?v=GfgtFbJ25lM
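
The three modes can be read as states of a single loop: plan off-line, then mostly monitor the automated loop, and intervene only when the process drifts outside its tolerance band. The sketch below is an illustrative reading of Sheridan's diagram, not his formal model; the setpoint, tolerance band, and update rule are invented for the example.

```python
# Illustrative sketch of the planning / monitoring / intervening modes.
import random

PLAN = {"setpoint": 100.0, "tolerance": 10.0}   # planning mode: decided off-line

def automated_step(value, setpoint):
    """The inner computer/process loop that the operator normally just watches."""
    return value + 0.5 * (setpoint - value) + random.uniform(-3, 3)

def supervise(steps=50):
    value, mode = 80.0, "monitoring"
    for _ in range(steps):
        value = automated_step(value, PLAN["setpoint"])
        if abs(value - PLAN["setpoint"]) > PLAN["tolerance"]:
            mode = "intervening"        # operator interrupts the loop, takes direct control
            value = PLAN["setpoint"]    # crude manual correction back to the plan
        else:
            mode = "monitoring"         # routine allocation of attention across displays
    return mode, value

if __name__ == "__main__":
    random.seed(0)
    print(supervise())
```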

7. Degrees of Automation
• The computer offers no assistance: the human must do it all
• The computer suggests alternative ways to do the task
• The computer suggests one way to do the task, and …
  • … executes the suggestion if the human approves
  • … allows the human a restricted time to veto before automatic execution
  • … executes automatically, then necessarily informs the human
  • … executes automatically, and informs the human only if asked
• The computer selects and executes the task, ignoring the human
http://www.youtube.com/watch?v=X0I5DHOETFE
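
The degrees form an ordered scale, which suggests a natural representation in software: an ordered enumeration plus a check for whether the human still holds approval or veto authority. The sketch below is a loose paraphrase of the slide's wording; the enum names and the consent rule are my own shorthand, not a standard API.

```python
# Sketch of the slide's "degrees of automation" as an ordered scale.
from enum import IntEnum

class AutomationLevel(IntEnum):
    NO_ASSISTANCE = 1          # the human must do it all
    SUGGESTS_ALTERNATIVES = 2  # computer offers alternative ways to do the task
    SUGGESTS_ONE = 3           # computer recommends one way to do the task
    EXECUTES_IF_APPROVED = 4   # acts only if the human approves
    EXECUTES_UNLESS_VETOED = 5 # acts after a restricted veto window
    EXECUTES_THEN_INFORMS = 6  # acts, then necessarily informs the human
    INFORMS_IF_ASKED = 7       # acts, informs the human only if asked
    FULLY_AUTONOMOUS = 8       # acts, ignoring the human

def human_holds_authority(level: AutomationLevel) -> bool:
    """True when the human can still approve or veto before execution."""
    return level <= AutomationLevel.EXECUTES_UNLESS_VETOED

if __name__ == "__main__":
    for level in AutomationLevel:
        print(level.name, "->", "human authority" if human_holds_authority(level) else "automatic")
```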

8. Andre's Taxonomy of Software Automation
• Remembers and recalls information for users based on previous actions (e.g., browser history of websites visited)
• Completes all or part of a user's input (e.g., auto-complete in search fields)
• Selects a format based on preceding actions (e.g., phone-number formatting)
• Makes decisions based on time lapse (e.g., auto-logout during on-line banking; sketched below)
• Adds new programs based on new hardware installation (e.g., XP plug-and-play for a new printer)
• Initiates software based on files being added (e.g., virus scanning when email attachments are added)
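
One entry in the taxonomy, "makes decisions based on time lapse", is easy to make concrete. The sketch below illustrates the on-line banking auto-logout example; the five-minute timeout and the `Session` class are illustrative assumptions, not part of Andre's taxonomy.

```python
# Sketch of automation that decides based on elapsed time (auto-logout).
import time

SESSION_TIMEOUT_S = 300  # assumed value: 5 minutes of inactivity

class Session:
    def __init__(self):
        self.last_activity = time.monotonic()

    def touch(self):
        """Record user activity (a click, keystroke, or page load)."""
        self.last_activity = time.monotonic()

    def expired(self) -> bool:
        """The software decides, from time lapse alone, to log the user out."""
        return time.monotonic() - self.last_activity > SESSION_TIMEOUT_S

if __name__ == "__main__":
    s = Session()
    print("expired immediately?", s.expired())   # False: no time has lapsed yet
```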

9. Ten Considerations of Human-Centered Automation
• Allocate to the human the tasks best suited to the human; allocate to the automation the tasks best suited to it
• Make the operator a supervisor of subordinate automatic control systems
• Keep the human in the decision and control loop
• Maintain the human operator as the final authority over the automation
• Make the human operator's job easier, more enjoyable, or more satisfying through friendly automation
• Empower the human operator to the greatest extent possible, through flexibility of the interface or through automation
• Support trust by the human operator
• Give the operator information about everything he or she should want to know
• Engineer the automation to minimize human error and response variability
• Achieve the best combination of human and automatic control, where "best" is defined by explicit objectives

10. Grice's Maxims of Social Etiquette (applied to computers/automation?)
• Maxim of quantity: say what serves the present purpose, but not more
• Maxim of quality: say what you know to be true, based on sufficient evidence
• Maxim of relation: be relevant, to advance the current conversation
• Maxim of manner: avoid obscurity of expression, wordiness, ambiguity, and disorder

11.–18. Rasmussen's Hierarchy of Behavior
[Figure-only slides; the hierarchy diagram is not reproduced in the transcript]

19. Rasmussen's Hierarchy of Behavior – braking with ABS
[Figure-only slide; diagram labels: "Braking with ABS", "STOP sign", "Step on brake"]

20. Rasmussen's Hierarchy of Behavior – ABS off
[Figure-only slide; diagram labels: "ABS off", "ABS not working", "ABS signal & STOP sign", "How to brake?", "50% of braking power", "Step on brake"]

21. Rasmussen's Hierarchy of Behavior – driving with failed ABS
[Figure-only slide; diagram labels: "Driving with failed ABS", "Failed pump", "Continue to drive", "Assess stopping distance", "ABS signal", "How to brake?", "50% of braking power", "STOP sign", "Step on brake"]
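
Slides 19–21 walk the same braking task up Rasmussen's hierarchy: a routine stop handled automatically, a recognized ABS-off condition handled by a stored rule, and a novel pump failure that forces reasoning about stopping distance. The mapping below is my reading of the diagram labels, sketched as a lookup; the situation strings and level assignments are illustrative, not taken from the slides.

```python
# Illustrative mapping of the ABS braking example onto Rasmussen's levels.

def braking_behavior(situation: str) -> str:
    # Skill-based: familiar cue, practiced automatic response (slide 19).
    if situation == "stop sign, ABS working":
        return "skill-based: step on the brake"
    # Rule-based: a recognized abnormal state matched to a stored rule (slide 20).
    if situation == "stop sign, ABS warning light on":
        return "rule-based: brake earlier, expect roughly 50% of braking power"
    # Knowledge-based: novel failure, reason about the situation (slide 21).
    if situation == "driving with failed ABS pump":
        return "knowledge-based: assess stopping distance, decide whether to continue driving"
    return "unrecognized situation"

if __name__ == "__main__":
    for s in ("stop sign, ABS working",
              "stop sign, ABS warning light on",
              "driving with failed ABS pump"):
        print(s, "->", braking_behavior(s))
```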

22.–26. Reason's Human Error Taxonomy
[Slides 22–26 repeat the same taxonomy diagram; it is reproduced once here]
Unsafe Acts
• Unintended Action
  • Slip (attentional failures): intrusion, omission, reversal, mis-ordering, mis-timing
  • Lapse (memory failures): omitting planned items, place-losing, forgetting intentions
• Intended Action
  • Mistake (rule-based mistakes): misapplication of a good rule, application of a bad rule
  • Violation: routine violations, exceptional violations, acts of sabotage
(Slips, lapses, and mistakes together form the diagram's "basic error types")
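
Because the taxonomy is a tree, it can be captured as a small lookup structure, which is handy when classifying or logging incidents. The sketch below follows the slide's branches; the Python types and the helper function are illustrative assumptions, not part of Reason's work.

```python
# Sketch of Reason's error taxonomy as a lookup structure.
from enum import Enum

class ErrorType(Enum):
    SLIP = "slip"            # unintended action, attentional failure
    LAPSE = "lapse"          # unintended action, memory failure
    MISTAKE = "mistake"      # intended action, wrong rule or plan
    VIOLATION = "violation"  # intended action, deliberate deviation

FAILURE_MODES = {
    ErrorType.SLIP: ["intrusion", "omission", "reversal", "mis-ordering", "mis-timing"],
    ErrorType.LAPSE: ["omitting planned items", "place-losing", "forgetting intentions"],
    ErrorType.MISTAKE: ["misapplication of a good rule", "application of a bad rule"],
    ErrorType.VIOLATION: ["routine violation", "exceptional violation", "act of sabotage"],
}

def unintended(error: ErrorType) -> bool:
    """Slips and lapses are unintended; mistakes and violations are intended."""
    return error in (ErrorType.SLIP, ErrorType.LAPSE)

if __name__ == "__main__":
    for e, modes in FAILURE_MODES.items():
        print(e.name, "unintended" if unintended(e) else "intended", modes)
```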

27. Unintended Intrusion (example)
• A hangar was rigged with light/heat-sensitive gear designed to detect fires and then flood the hangar with foam if triggered
• The system was armed each night after everyone left for the evening
• One night the janitor decided to take a photo — the camera flash set off the light-sensitive system

28. Task Analysis
[Figure-only slide: an example task-analysis table, not reproduced in the transcript]

29. Case Study: Three Mile Island
http://www.youtube.com/watch?v=eLPAigMuBk0&feature=related
The accident at the Three Mile Island Unit 2 (TMI-2) nuclear power plant in Pennsylvania on March 28, 1979 was one of the most serious in the history of the U.S. nuclear industry. It not only brought to light the hazards associated with nuclear power, but also forced the industry to take a closer look at the operating procedures used at the time. What makes the TMI-2 accident such an interesting case study is the series of events which led up to the partial meltdown of the reactor core. It was a combination of human error, insufficient training, bad operating procedures, and unforeseen equipment failure that culminated in a nuclear accident that could easily have been prevented. (onlineethics.org)

30. Simulator
• The purpose of the simulator is to:
  • Provide you with the experience of learning how a complex process works, as a UI designer would need to learn such a system
  • Have you learn how to dissect an incident, to understand the job of a UI designer in improving the user interface for complex systems
• Follow these steps:
  • Click on the file "Nuclear Plant Simulation" and run the program
  • Start the first part of the program by first clicking "Start"
  • Then click the "Next" button to step through the simulation
  • After you have completed the simulator for the basic process, click the "Go to TMI Incident" button
  • Then click the "Start" button and then the "Next" button to step through the simulator
  • After completing the TMI simulator, click the "User/System Interface Events" button to see how the UI designer would log the TMI HCI events

31. Human Factors Analysis of the TMI Incident
• L–H interface (liveware to hardware)
  • Instruments provided faulty information (the PORV was indicated as shut)
  • One year prior to TMI, a similar accident occurred in which the PORV was stuck open
    • Operators requested an indicator of the PORV status
    • What was provided was the "desired" state of the PORV, not the actual state
  • The radiation indicator in the containment building failed
  • The twelve-valve indicator was covered up by a maintenance tag
  • The sump-pump light was missed because it was located on the opposite side of the control panel
  • Operators could not determine the temperature of the coolant in the sump
    • Extremely high levels would have indicated an uncovered core

32. Human Factors Analysis of the TMI Incident
• L–S interface (liveware to software: operators to procedures and system rules)
  • Operators incorrectly responded by turning off the HPI pumps
    • The HPI pumps had been turned on automatically
    • The pressure was decreasing and the reactor coolant temperature remained constant
    • This was the desired state of the system
  • The operators incorrectly interpreted the procedures inherent to the system
    • When saturation was reached, the operators interpreted the high pressure to mean that there was ample water in the system
    • Given the preceding events, the operators should have interpreted the high pressure as resulting from air replacing the water and exposing the core

33. Human Factors Analysis of the TMI Incident
• L–S interface (continued)
  • Given that
    • unusually high levels of neutrons were detected in the core, and
    • the temperature and pressure in the containment building continued to rise,
    the operators should have interpreted this situation as a LOCA (loss-of-coolant accident)
  • When the pumps started to vibrate
    • this should have been interpreted as the pumps pumping steam and water
    • the fact that steam was being pumped indicated that the reactor's water was boiling into steam

34. Human Factors Analysis of the TMI Incident
• L–E interface (liveware to environment)
  • Operators ignored relevant feedback from the environment and focused only on the displays
  • They failed to consider the relevance of the hydrogen explosion
• L–L interface
  • The maintenance crew for the twelve valves failed to reopen the valves
  • The operators assumed the valves were open
  • Communication between the maintenance personnel and the operators could have avoided the accident
• TMI management
  • Operators were trained to avoid a "solid" state (too much water in the system)
  • Operators decreased the flow of water, which led to the LOCA

35. Human Factors Analysis of the TMI Incident
• Human–Computer Interaction and TMI
  • A computer could provide support for the complex interaction between system parameters:
    • IF the HPI pumps have been turned on, AND the pressure is decreasing, AND the reactor coolant temperature is constant,
    • THEN the system is experiencing a LOCA, so leave the HPI pumps on
    (a sketch of this rule as code follows below)
  • Integrate information related to past system performance and to other, similar systems
    • e.g., PORV history integrated into the problem-solving task
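
The IF/THEN rule on this slide is directly executable, which is the point of computer decision support here. A minimal sketch: the rule itself is the slide's, but the parameter names and the returned advice strings are assumptions made for illustration.

```python
# Sketch of the slide's IF/THEN rule as a decision-support check.

def assess_loca(hpi_pumps_on: bool, pressure_falling: bool,
                coolant_temp_constant: bool) -> str:
    """HPI pumps on + falling pressure + constant coolant temperature
    indicates a loss-of-coolant accident (LOCA) per the slide's rule."""
    if hpi_pumps_on and pressure_falling and coolant_temp_constant:
        return "Probable LOCA: leave the HPI pumps ON"
    return "No LOCA indicated by this rule"

if __name__ == "__main__":
    # The TMI pattern: operators saw exactly this and still shut the pumps off.
    print(assess_loca(hpi_pumps_on=True, pressure_falling=True,
                      coolant_temp_constant=True))
```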

36. Human Factors Analysis of the TMI Incident
• Human–Computer Interaction and TMI
  • Advantages of computer systems:
    • Not influenced by pressures from management
    • Decision making not influenced by the "life-threatening" aspects of the situation
    • Do not get stuck on one problem-solving method
      • At TMI the night-shift operators failed to detect that the PORV was stuck open; the day-shift operators, with a fresh perspective, did detect it

37. Human Factors Analysis of the TMI Incident
• Useful tools for an HCI designer/analyst:
  • SHEL Model
  • Supervisory Control (plan, monitor, intervene)
  • Degrees of Automation
  • Andre's Taxonomy of Software Automation
  • Ten Considerations of Human-Centered Automation
  • Grice's Maxims of Social Etiquette
  • Rasmussen's Hierarchy of Behavior
  • Reason's Human Error Taxonomy
  • Neisser's Perceptual Cycle
  • Task Analysis

38. The Physical View
[Figure-only slide]

39. The Operator's View (metaphor)
[Figure-only slide]
