
The Evolution of Human-Performance Modeling Techniques for Usability

Presentation Transcript


  1. The Evolution of Human-Performance Modeling Techniques for Usability Uri Dekel (udekel@cs.cmu.edu) Presented in “Methods of Software Engineering”, Fall 2004

  2. Outline • Motivation and scope • From early models to GOMS • Stimulus-Response-Controller models • Information Processing models • GOMS variants: what to use? • SW tools for GOMS • Lessons learned…

  3. Motivation and Scope

  4. Motivation • Minor timing differences may have a major economic impact • Consider a call center with 100 employees • Average call length = 1 min • 144000 calls per day for entire call center • Improvement of 2 seconds per call • 80 person hours per day • 29200 person hours per year
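
A quick back-of-the-envelope check of the slide's arithmetic, as a minimal Python sketch. The figures are the slide's own; the only added assumption is that the center handles calls around the clock, which is what makes 100 agents at one-minute calls come out to 144,000 calls per day.

```python
# Call-center savings arithmetic from the slide (assumes round-the-clock operation).
agents = 100
call_length_s = 60                                      # average call length: 1 minute
calls_per_day = agents * 24 * 3600 // call_length_s     # 144,000 calls per day

saving_per_call_s = 2                                    # 2-second improvement per call
saved_hours_per_day = calls_per_day * saving_per_call_s / 3600    # 80 person-hours
saved_hours_per_year = saved_hours_per_day * 365                  # 29,200 person-hours

print(saved_hours_per_day, saved_hours_per_year)         # 80.0 29200.0
```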

  5. Where can we optimize? • Moore’s law works for HW and SW • In the past, system reaction time was slow • Databases, networks and GUIs were slow • Now practically instantaneous • Moore’s law does not apply to humans… • But usability has significant impact on performance

  6. Motivation • Problems and solution: • “How to design more usable interfaces?” • Partial solution: usability methods and principles • “How to ensure a design can be used effectively?” • Inadequate solution: use intuition • Inadequate solution: functional prototypes in a design-implement-test-redesign cycle • Expensive and time consuming, especially for hardware • Possible solution: paper prototyping complemented by quantitative models for predicting human performance

  7. Motivation • We need to predict performance on a system which is not yet available • Nielsen, 1993: • “A Holy Grail for many usability scientists is the invention of analytic methods that would allow designers to predict the usability of a user interface before it has even been tested.” • “Not only would such a method save us from user testing, it would allow for precise estimates of the trade-offs between different design solutions without having to build them.” • “The only thing that would be better would be a generative theory of usability that could design the user interface based on a description of the usability goals to be achieved.”

  8. Cognitive modeling • Definition: • producing computational models for how people perform tasks and solve problems, based on psychological principles • Uses: • Predicting task duration and error potential • Adapting interfaces by anticipating behavior

  9. Outside our Scope • Predicting the intent of the user • Model the activities of the user • Relies on AI techniques to make predictions • Useful for intelligent and adaptable UIs • Improves learning curve

  10. Outside our Scope • Predicting the intent of the user • Model the activities of the user • Relies on AI techniques to make predictions • Useful for intelligent and adaptable UIs • Improves learning curve • But not always successful…

  11. Scope • Predicting the usability of the UI • Qualitative models • Will the UI be intuitive and simple to learn? • Is the UI aesthetic and consistent? • Will the user experience be positive? • Quantitative models • How long will it take to become proficient in using the UI? • How long will it take a skilled user to accomplish the task?

  12. Goal of this talk • The goal is NOT: • To introduce you to GOMS and its variants • You got that from the reading • The goal is: • To provide the theoretical foundation and evolution of models which led to GOMS • To show tools that support GOMS • To understand how it could be useful to you

  13. Early models

  14. Stimulus-Response-Controller • Research predates Computer Science • Attempts to improve usability of interactive electronic systems such as control panels, radar displays, air traffic control, etc. • Early models developed by experimental psychology researchers in the 1950s • Limited to single, short perceptual and motor activities • Based on information and communications theory • The human is modeled as a simple device that responds to stimuli by carrying out a motor behavior • Based on Shannon’s definitions of entropy and channel capacity

  15. Information Theory 101: Entropy • Entropy of a random event is a measure of its actual randomness • High entropy if unpredictable:

  16. Information Theory 101: Entropy • Entropy of a random event is a measure of its actual randomness • High entropy if unpredictable: • The winning numbers for this week’s lottery • Same probability for all results • Low entropy if predictable:

  17. Information Theory 101: Entropy • Entropy of a random event is a measure of its actual randomness • High entropy if unpredictable: • The winning numbers for this week’s lottery • Same probability for all results • Low entropy if predictable: • What lunch will be served at the next seminar? • High probability of Pizza. Low probability for Sushi

  18. Information Theory 101: Entropy • Entropy can measure the amount of information in a message • Consider a message encoded as a string of bits • Is the next bit 0 or 1? • High entropy • What if we add a parity bit? • Lower entropy for the parity bit • What if we replicate every bit once? • Even lower entropy for the replicated bits

  19. Information Theory 101: Entropy • Formally: • Let X be a random event with n possible values, where value xi occurs with probability pi • The entropy of X is: H(X) = −Σ pi log2 pi (sum over i = 1…n)
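
To make the definition concrete, here is a minimal Python sketch of the entropy formula; the two example distributions are mine, chosen to mirror the lottery and lunch examples from the previous slides.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair lottery among 6 equally likely outcomes: maximal unpredictability.
print(entropy([1/6] * 6))     # ~2.585 bits

# Seminar lunch: pizza very likely, sushi rare -> low entropy.
print(entropy([0.9, 0.1]))    # ~0.469 bits
```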

  20. Information Theory 101: Channel Capacity • Information rate in a perfect channel • n = bits per second • H = entropy per bit • R = nH • R=n if entropy is 1 (pure data) • The channel bandwidth curbs the rate

  21. Information Theory 101: Channel Capacity • Information rate in an analog channel • Curbed by bandwidth and noise • We can fix some errors using different encodings • Is there a limit to how much we can transfer?

  22. Information Theory 101: Channel Capacity • Shannon’s definition of channel capacity • Maximal information rate possible on the channel • For every R < C, there is an encoding which allows the message to be sent with an arbitrarily small error rate • Theoretical maximum effectiveness of error correction codes • Does not tell us what the code is • Capacity formula: C = B log2(1 + SNR) • B = bandwidth • SNR = Signal-to-noise ratio
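
For illustration, a small Python sketch of the Shannon–Hartley capacity formula. The bandwidth and SNR figures are hypothetical (a voice-band channel), not taken from the slides.

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3 kHz voice-band channel with a 30 dB SNR.
snr_linear = 10 ** (30 / 10)                 # 30 dB -> linear ratio of 1000
print(channel_capacity(3000, snr_linear))    # ~29,900 bits per second
```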

  23. Fitts’ Law • Paul Fitts studied human limitations in performing different movement tasks • Measured difficulty of movement tasks in information-metric bits • A movement task is the transmission of information through the “human channel” • But this channel has a capacity

  24. Fitts’ Law • Fitts’ law [1954] predicts movement time from a starting point to a specific target area • Difficulty index: ID = log2(2A / W) • A = distance to target center, W = target width • Movement time: MT = a + b · ID • a = device dependent intercept • b = device dependent Index of Performance • The coefficients are measured experimentally • e.g., mouse IP lower than stylus, joystick
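
A minimal sketch of how these formulas yield concrete predictions, using the log2(2A/W) form of the difficulty index from the slide. The coefficient values below are invented for illustration; in practice a and b must be fitted from measurements with the actual device.

```python
import math

def fitts_movement_time(a, b, distance, width):
    """Fitts' law: MT = a + b * ID, where ID = log2(2A / W) in bits."""
    index_of_difficulty = math.log2(2 * distance / width)
    return a + b * index_of_difficulty

# Hypothetical mouse coefficients (seconds and seconds/bit); measured per device in practice.
a, b = 0.1, 0.15
print(fitts_movement_time(a, b, distance=400, width=20))   # far, small target: ~0.90 s
print(fitts_movement_time(a, b, distance=400, width=80))   # same distance, larger target: ~0.60 s
```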

  25. Fitts’ Law Implications • Primary implication: • Big targets at close distance are acquired faster than small targets at long range • Used to empirically test certain designs • Theoretical rationale for many design principles

  26. Fitts’ Law Implications • Should buttons on a stylus-based touch screen (e.g., a PDA) be smaller, larger, or the same size as buttons on a mouse-based machine?

  27. Fitts’ Law Implications • Should buttons on a stylus-based touch screen (e.g., a PDA) be smaller, larger, or the same size as buttons on a mouse-based machine? • Answer: larger, because it is more difficult to point the stylus precisely (higher index of performance)

  28. Fitts’ Law Implications • Why is the context-sensitive menu (the “right-click menu” in Windows) located close to the mouse cursor?

  29. Fitts’ Law Implications • Why is the context-sensitive menu (the “right-click menu” in Windows) located close to the mouse cursor? • Answer: the mouse needs to travel a shorter distance

  30. Fitts’ Law Implications • Which is better for a context-sensitive menu, a pie menu or a linear menu?

  31. Fitts’ Law Implications • Which is better for a context-sensitive menu, a pie menu or a linear menu? • Answer: if all options have equal probabilities, a pie menu; if one option is highly dominant, a linear menu

  32. Fitts’ Law Implications • In Microsoft Windows, why is it easier to close a maximized window than to close a regular window?

  33. Fitts’ Law Implications • In Microsoft Windows, why is it easier to close a maximized window than to close a regular window? • Answer: because the mouse cursor cannot leave the screen, the close box in the corner of a maximized window has an effectively infinite target size; the user can overshoot without missing it

  34. Fitts’ Law Implications • Why use mouse-gestures to control applications?

  35. Fitts’ Law Implications • Why use mouse-gestures to control applications? • Answer: a mouse gesture starts at the cursor’s current location and requires only limited movement, compared to traveling to and acquiring the necessary buttons.

  36. Fitts’ Law limitations • Addresses only target distance and size, ignores other effects • Applies only to one-dimensional targets • Later research showed extensions to 2D and 3D • Considers only human motor activity • Cannot account for software acceleration • Does not account for training • Insignificant effect in such low-level operations

  37. Fitts’ Law limitations • Only supports short paths • Research provided methods for complicated paths, using integration • But most importantly: • Operates at a very low level • Difficult to extend to complex tasks

  38. Hick’s Law • Humans have a non-zero reaction time • The situation is perceived, then a decision is made • Hick’s law predicts decision time as a function of the number of choices n • Humans try to subdivide a problem • Binary rather than linear search • For equal probabilities: T = a · log2(n + 1) • coefficient a measured experimentally • For differing probabilities: T = a · Σ pi log2(1/pi + 1)
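
A minimal sketch of both forms of the law, matching the single experimentally measured coefficient a mentioned on the slide. The coefficient value and the example probability distributions are hypothetical.

```python
import math

def hick_equal(a, n):
    """Hick's law with n equally likely choices: T = a * log2(n + 1)."""
    return a * math.log2(n + 1)

def hick_unequal(a, probs):
    """Hick-Hyman form for unequal choice probabilities: T = a * sum(p_i * log2(1/p_i + 1))."""
    return a * sum(p * math.log2(1 / p + 1) for p in probs if p > 0)

a = 0.2   # hypothetical coefficient, seconds per bit
print(hick_equal(a, 5))                               # 5 equally likely options: ~0.52 s
print(hick_unequal(a, [0.7, 0.1, 0.1, 0.05, 0.05]))   # one dominant option: ~0.41 s, faster
```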

  39. Hick’s Law • Hick’s law holds only if a selection strategy is possible • e.g., alphabetical listing • Intuitive implications: • Split menus into categories and groups • An unfamiliar command should be close to related familiar commands

  40. Hick’s Law Example • The next slide presents a screenshot from Microsoft Word • How fast can you locate the toolbar button for the “WordArt character spacing” command?

  41. Hick’s Law Example

  42. Limitations of the Early Models • Developed before interactive computer systems became prevalent • Use metaphors of analog signal processing • Human-Computer Interaction is continuous • Cannot be broken down into discrete events • Human processing has parallelism

  43. Information Processing Models

  44. Information Processing Models • Developed in the 1960s • Combine psychology and computer science • Humans perform sequences of operations on symbols • Generic structure: perceptual input → cognitive processing → motor output

  45. Information Processing Models • Models from general psychology are fitted to the results of actual experiments • Not predictive for other systems • Zero-parameter models can provide predictions for a future system • Parameterized only by information from existing systems • e.g., typing speed, difficulty index, etc.

  46. Model Human Processor • [Card, Moran and Newell, 1983] • Framework for zero-parameter models of specific tasks • Humans process visual and auditory input • Output is motor activity • Unique in decomposition into three systems • Each consists of a processor and memory • Can operate serially or in parallel • Each with unique rules of operation

  47. Model Human Processor

  48. Model Human Processor Properties • Processor • Cycle time limits amount of processing • Memory • Relatively permanent long-term memory (LTM) • Short-term memory • Consists of small activated LTM chunks • There are “seven plus or minus two” chunks • Every memory unit has: • Capacity, decay time, information type

  49. Perceptual System • Input arrives from perceptual receptors in the outside world • Placed in the visual and auditory stores • Stored close to its physical form • “bitmaps” and “waveforms” rather than symbols • The processor encodes the input symbolically and stores it in LTM • Memory and processing limitations lead to memory loss • Attention directs which items are saved

  50. Cognitive System • Responsible for making decisions and scheduling motor operations • Performs a recognize-act cycle • Uses association to activate LTM chunks • Acts by modifying data in working memory • Might require several cycles
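
As a small illustration of how the three subsystems combine, the sketch below predicts a simple reaction time (perceive a stimulus, decide, press a key) by summing one cycle of each processor. The nominal cycle times are the middle-of-range values given by Card, Moran and Newell for the Model Human Processor; treat the exact numbers as approximate.

```python
# Nominal Model Human Processor cycle times (Card, Moran & Newell, 1983), in milliseconds.
PERCEPTUAL_CYCLE_MS = 100   # nominal, range roughly 50-200 ms
COGNITIVE_CYCLE_MS = 70     # nominal, range roughly 25-170 ms
MOTOR_CYCLE_MS = 70         # nominal, range roughly 30-100 ms

def simple_reaction_time_ms(cognitive_cycles=1):
    """Predict time to perceive a stimulus, decide, and execute one motor action."""
    return PERCEPTUAL_CYCLE_MS + cognitive_cycles * COGNITIVE_CYCLE_MS + MOTOR_CYCLE_MS

print(simple_reaction_time_ms())                    # ~240 ms for a single recognize-act cycle
print(simple_reaction_time_ms(cognitive_cycles=3))  # ~380 ms if the decision needs several cycles
```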
