
Usability and Evaluation


Presentation Transcript


  1. Usability and Evaluation Dov Te’eni

  2. Figure 7-2: Attitudes, use, performance and satisfaction [diagram relating attitudes (perceived usefulness, perceived usability), use (intentions and behavior), performance, and satisfaction]

  3. Usability indicators based on performance
  1. Goal achievement indicators (success rate, failure rate, accuracy, effectiveness).
  2. Work rate indicators (speed, completion rate, efficiency, productivity, productivity gain).
  3. Operability indicators of the user's ability to make use of the system's features (error rate, problem rate, function usage).
  4. Knowledge acquisition indicators of the user's ability and effort in learning to use the system (learnability and learning period).
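These four categories map onto simple computations over logged task attempts. A minimal sketch, assuming a per-attempt record (success flag, task time, error count, trial number) that the slides do not prescribe:

```python
# A rough sketch of computing a few performance-based indicators; the TaskAttempt
# fields are assumptions made for this illustration, not taken from the slides.
from dataclasses import dataclass
from typing import List

@dataclass
class TaskAttempt:
    succeeded: bool      # goal achievement
    task_time_s: float   # work rate
    errors: int          # operability
    trial_number: int    # position in the learning sequence (knowledge acquisition)

def indicators(attempts: List[TaskAttempt]) -> dict:
    n = len(attempts)
    successes = [a for a in attempts if a.succeeded]
    return {
        "success_rate": len(successes) / n,                            # goal achievement
        "failure_rate": 1 - len(successes) / n,
        "mean_task_time_s": sum(a.task_time_s for a in attempts) / n,  # work rate
        "errors_per_attempt": sum(a.errors for a in attempts) / n,     # operability
        # crude learnability proxy: first trial on which the user succeeded
        "learning_period_trials": min((a.trial_number for a in successes), default=None),
    }

# Fabricated numbers, purely to show the calling convention
trials = [TaskAttempt(False, 210.0, 4, 1), TaskAttempt(True, 150.0, 2, 2), TaskAttempt(True, 95.0, 0, 3)]
print(indicators(trials))
```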

  4. Usability measures based on performance and HCI
  1. Time to complete a task
  2. Number of user commands to complete the task
  3. Fraction of task completed
  4. Fraction of task completed in a given time
  5. Number of errors
  6. Time spent on errors
  7. Frequency of online help used

  5. Usability measures based on performance and HCI (cont.)
  8. Number of available commands not used
  9. When a task is repeated, ratio of successes to failures
  10. Fraction of positive comments made by the user
  11. Fraction of good to bad features recalled by the user
  12. Number of expressions of frustration and satisfaction
  13. Number of times the user loses control over the system
  14. Number of times the user needs to devise a way of working around the problem/system
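Most of measures 1-8 can be derived mechanically from an instrumented session log. A minimal sketch, assuming a timestamped event log with event kinds ("command", "error", "error_resolved", "help") and a known command vocabulary; the event format is an assumption made for this illustration, not part of the original list:

```python
# Minimal sketch: deriving measures 1, 2, 5, 6, 7 and 8 from a session event log.
from typing import List, Tuple

Event = Tuple[str, str, float]  # (kind, detail, timestamp_s)

def hci_measures(events: List[Event], available_commands: set) -> dict:
    times = [t for _, _, t in events]
    errors = [t for kind, _, t in events if kind == "error"]
    resolved = [t for kind, _, t in events if kind == "error_resolved"]
    commands = [d for kind, d, _ in events if kind == "command"]
    return {
        "time_to_complete_s": max(times) - min(times),                           # measure 1
        "user_commands": len(commands),                                          # measure 2
        "number_of_errors": len(errors),                                         # measure 5
        # measure 6: pair each error with the next resolution (simplistic one-to-one pairing)
        "time_spent_on_errors_s": sum(r - e for e, r in zip(errors, resolved)),
        "help_invocations": sum(1 for kind, _, _ in events if kind == "help"),   # measure 7
        "available_commands_not_used": len(available_commands - set(commands)),  # measure 8
    }

# Fabricated log: open a file, hit an error corrected after 4 seconds, then save
log = [("command", "open", 0.0), ("error", "bad path", 5.0),
       ("error_resolved", "", 9.0), ("command", "save", 20.0)]
print(hci_measures(log, available_commands={"open", "save", "print", "undo"}))
```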

  6. The usability engineering life cycle
  · Know the user
  · Analyze competing products
  · Set usability goals
  · Consider alternative designs
  · Engage in participatory design
  · Coordinate the total interface
  · Check against heuristic guidelines
  · Prototype
  · Evaluate the interface
  · Design in iterations
  · Follow up with studies of installed systems

  7. Attitude questionnaire
  1. Using a microcomputer could provide me with information that would lead to better decisions.
  2. I wouldn't use a microcomputer because programming it would take too much time.
  3. I'd like to use a microcomputer because it is oriented to user needs.
  4. I wouldn't use a microcomputer because it is too time consuming.
  5. Using a microcomputer would take too much time away from my normal duties.
  6. Using a microcomputer would involve too much time doing mechanical operations (e.g., programming, inputting data) to allow sufficient time for managerial analysis.
  7. A microcomputer would be of no use to me because of its limited computing power.
  8. I'd like to learn about ways that microcomputers can be used as aids in managerial tasks.
  9. Using a microcomputer would result in a tendency to over-design simple tasks.
  10. I wouldn't want to have a microcomputer at work because it would distract me from my normal job duties.

  8. Attitude questionnaire (cont.)
  1. A microcomputer would give me more opportunities to obtain the information that I need.
  2. I wouldn't favor using a microcomputer because there would be a tendency to use it even when it was more time consuming than manual methods.
  3. I'd like to have a microcomputer because it is so easy to use.
  4. I'd hesitate to acquire a microcomputer for my use at work because of the difficulty of integrating it with existing information systems.
  5. I'd discourage my company from acquiring microcomputers because most application packages would need to be modified before they could be useful in our specific situation.
  6. It is easy to access and store data in a microcomputer.
  7. A microcomputer would be of no use to me because of the limited availability of application program packages.
  8. A microcomputer would be of no use to me because of its small storage capacity.
  9. It is easy to retrieve or store information from/to a microcomputer.
  10. Using a microcomputer would give me much greater control over important information.
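These items mix positively and negatively worded statements, so a common scoring convention (not specified on the slides) is to reverse-code the negative items before averaging. A minimal sketch, assuming a 1-5 Likert response scale; the item numbers marked as negative in the example call are an assumption for illustration:

```python
# Minimal scoring sketch: average item score after reverse-coding negatively worded items.
from typing import Dict, Set

def attitude_score(responses: Dict[int, int], negative_items: Set[int], scale_max: int = 5) -> float:
    adjusted = [
        (scale_max + 1 - r) if item in negative_items else r
        for item, r in responses.items()
    ]
    return sum(adjusted) / len(adjusted)

# Illustrative call: items 2, 4 and 5 treated as negatively worded
print(attitude_score({1: 4, 2: 2, 3: 5, 4: 1, 5: 2}, negative_items={2, 4, 5}))
```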

  9. Heuristic guidelines
  1. Create simple and natural dialog
  2. Speak the user's language
  3. Minimize the user's memory load
  4. Be consistent
  5. Provide feedback
  6. Provide clearly marked exits
  7. Provide shortcuts
  8. Provide specific, corrective and positive error messages
  9. Minimize propensity for error
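One way to apply these guidelines in a heuristic evaluation is to tally each reviewer finding against the guideline it violates. A small sketch; the (guideline number, severity 0-4) finding format is an assumed convention, not something the slide prescribes:

```python
# Minimal sketch: counting findings per guideline and tracking the worst severity seen
# (0 = cosmetic ... 4 = catastrophic, an assumed scale).
from typing import Dict, List, Tuple

HEURISTICS = {
    1: "Simple and natural dialog", 2: "Speak the user's language",
    3: "Minimize the user's memory load", 4: "Be consistent",
    5: "Provide feedback", 6: "Provide clearly marked exits",
    7: "Provide shortcuts", 8: "Specific, corrective, positive error messages",
    9: "Minimize propensity for error",
}

def summarize(findings: List[Tuple[int, int]]) -> Dict[int, Tuple[int, int]]:
    """Map guideline number -> (finding count, worst severity)."""
    summary: Dict[int, Tuple[int, int]] = {}
    for guideline, severity in findings:
        count, worst = summary.get(guideline, (0, 0))
        summary[guideline] = (count + 1, max(worst, severity))
    return summary

# Two consistency problems and one feedback problem, severities invented for illustration
for g, (count, worst) in summarize([(4, 2), (5, 3), (4, 1)]).items():
    print(f"{g}. {HEURISTICS[g]}: {count} finding(s), worst severity {worst}")
```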

  10. Evaluation techniques classified
  1) Exploratory vs. model based
  2) Design or implementation
  3) Field study vs. laboratory testing
  4) Design vs. use
  5) Level of performance measures
  6) Degree of designed manipulation and intrusion

  11. Cognitive walkthrough start-up (from Polson)
  Task description: Describe the task from the first-time user's viewpoint. Include any special assumptions about the state of the system assumed when the user begins work.
  Action sequence: Make a numbered list of the atomic actions that the user should perform to accomplish the task.
  Anticipated users: Briefly describe the class of users who will use this system. Note what experience they are expected to have with similar or previous versions.
  User's initial goals: List the goals the user is likely to form when starting the task. If there are other likely goals, list them, and estimate for each what percentage of users are likely to have them.
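To make the start-up items concrete, the four parts can be captured as a simple record before the walkthrough begins. A minimal sketch with illustrative field names and example values, not taken from Polson's text:

```python
# Minimal sketch: a record holding the four walkthrough start-up items listed above.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class WalkthroughStartUp:
    task_description: str            # task from the first-time user's viewpoint
    initial_system_state: str        # special assumptions when the user begins work
    action_sequence: List[str]       # numbered atomic actions to accomplish the task
    anticipated_users: str           # class of users and their prior experience
    initial_goals: Dict[str, float]  # likely goal -> estimated % of users holding it

# Hypothetical example values, purely for illustration
example = WalkthroughStartUp(
    task_description="Send a previously saved draft message",
    initial_system_state="Draft exists; user is already logged in",
    action_sequence=["Open the Drafts folder", "Select the draft", "Press Send"],
    anticipated_users="First-time users who have used earlier versions of the mail client",
    initial_goals={"send the draft": 90.0, "edit the draft before sending": 10.0},
)
print(example.action_sequence)
```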

  12. Measures for comparing displays
  · Typing mistakes made before correction
  · Mistakes remaining
  · Pauses of 3 seconds or more immediately before a mode change
  · Other pauses of 3 seconds or more
  · Length of pause immediately before a mode change
  · Attempting to type over whilst in insert mode
  · Attempting to insert whilst in type-over mode
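Several of these measures hinge on detecting pauses of 3 seconds or more in a keystroke log and classifying whether they immediately precede a mode change. A minimal sketch, assuming (timestamp, is_mode_change) keystroke records, a format invented for this illustration:

```python
# Minimal sketch: split long inter-keystroke pauses into those immediately before a
# mode change and all others, per the measures listed above.
from typing import List, Tuple

def long_pauses(keystrokes: List[Tuple[float, bool]], threshold_s: float = 3.0):
    before_mode_change, other = [], []
    for (t_prev, _), (t_next, is_mode_change) in zip(keystrokes, keystrokes[1:]):
        gap = t_next - t_prev
        if gap >= threshold_s:
            (before_mode_change if is_mode_change else other).append(gap)
    return before_mode_change, other

# Fabricated data: a 4 s pause before the mode change at t=10, plus 6 s and 5 s ordinary pauses
print(long_pauses([(0.0, False), (6.0, False), (10.0, True), (15.0, False)]))
```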

  13. 1) Effectiveness = Quantity × Quality (%)
  2) Relative efficiency = (User effectiveness / Expert effectiveness) × (Expert task time / User task time) (%)
  3) Productive period = (Task time − Total problem time − Learning time) / Task time
  4) Problem rate = Number of problems encountered / Task time
  6) Complexity factor = (Number of calls for assistance / Number of actions undertaken) + ((Learning time + Total problem time) / Task time)
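Written out as code, these formulas translate directly into small functions. A minimal sketch; the argument names paraphrase the slide's terms, and quantity and quality are assumed to be expressed as proportions of the ideal output:

```python
# Minimal sketch of the performance formulas above.

def effectiveness(quantity: float, quality: float) -> float:
    """Effectiveness = quantity * quality (both as proportions of the ideal)."""
    return quantity * quality

def relative_efficiency(user_eff: float, expert_eff: float,
                        expert_time: float, user_time: float) -> float:
    """(User effectiveness / Expert effectiveness) * (Expert task time / User task time)."""
    return (user_eff / expert_eff) * (expert_time / user_time)

def productive_period(task_time: float, total_problem_time: float, learning_time: float) -> float:
    """(Task time - Total problem time - Learning time) / Task time."""
    return (task_time - total_problem_time - learning_time) / task_time

def problem_rate(num_problems: int, task_time: float) -> float:
    """Number of problems encountered / Task time."""
    return num_problems / task_time

def complexity_factor(calls_for_assistance: int, actions_undertaken: int,
                      learning_time: float, total_problem_time: float, task_time: float) -> float:
    """Calls for assistance per action plus (learning + problem time) as a share of task time."""
    return calls_for_assistance / actions_undertaken + (learning_time + total_problem_time) / task_time

# Illustrative numbers: a user half as fast as the expert but 80% as effective
print(relative_efficiency(user_eff=0.8, expert_eff=1.0, expert_time=60, user_time=120))  # 0.4
```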
