
Usability Evaluation Ways of Studying User Interfaces Spring 2019



  1. Usability Evaluation Ways of Studying User Interfaces Spring 2019

  2. Usability Evaluation Types • User-centered Evaluation • Usability Testing • Other studies involving users (card sorting, eye tracking, quantitative surveys…) • Expert-based Evaluation • Usability inspection methods: Heuristic Evaluation / Cognitive Walkthrough / Pluralistic Walkthrough / Heuristic Walkthrough / Perspective-based Evaluation / … • Review-based evaluation • Model-based Evaluation • GOMS Model (Goals, Operators, Methods, Selection rules) • CMN GOMS (Original version of GOMS by Card, Moran and Newell 1983) • KLM GOMS (Keystroke Level Model GOMS) • NGOMSL (Natural GOMS Language) • CPM GOMS (Cognitive Perceptual Motor / Critical Path Method GOMS) Usability Engineering - Ways of Studying User Interfaces (Part A)

  3. Usability Testing • Evaluation of a product or service by testing it with representative users • (Best way to understand how real users experience a product or service) • Involves systematic observation under controlled conditions • During a test • Participants - try to complete typical tasks • Observers - watch, listen and take notes. • Main goals: • Collect quantitative and qualitative data • Analyze data to identify any usability problems • Determine the participant's satisfaction with the product. Usability Engineering - Ways of Studying User Interfaces (Part A)

  4. What usability testing is not • Usability testing is not simply gathering opinions on a product or service • (which is market research or qualitative research) • Usability testing involves • watching people trying to use something for its intended purpose • rather than • showing users a product and asking, “Do you understand how to use it?” • Example: Testing usability of an instruction pamphlet for a Lego toy • Participants should be given the instruction pamphlet plus a pile of Lego bricks to put the toy together • (rather than being asked to comment on the quality and ease of the instruction pamphlet itself) Usability Engineering - Ways of Studying User Interfaces (Part A)

  5. Usability testing measures Behavior, not Preference • Users are notoriously bad at articulating what they want • Asking a user ‘what do you want?’ is rarely, if ever, helpful • By observing and measuring behavior, we can understand what best supports their goals and motivation • Hence, the best questions to ask are the ones exploring • what people already do • how they do it • why they do it “If I had asked people what they wanted, they would have said faster horses.” Henry Ford Usability Engineering - Ways of Studying User Interfaces (Part A)

  6. Usability tests are performed in order to • Learn if participants are able to complete specified tasks successfully • Identify how long it takes to complete specified tasks • Find out how satisfied participants are with the product • Identify changes required to improve user performance and satisfaction • Analyze the performance to see if it meets preset usability objectives Main tasks in performing an effective usability test • Development of the Test Plan • Recruitment of participants • Carrying out the Usability Test • Analysis and reporting of the findings. Usability Engineering - Ways of Studying User Interfaces (Part A)

  7. Benefits of Usability Testing • Identifies problems users might encounter with a system • The earlier issues are identified, the less expensive fixing them will be • in terms of staff time • in terms of possible impact on the schedule • If performed in the early stages of development, problems are identified even before they are coded • Decreases support costs • Increases revenues • Makes customers happier • Strengthens the brand name Usability Engineering - Ways of Studying User Interfaces (Part A)

  8. Factors affecting the cost of a usability test • Size of the team assembled for testing • Number of participants for testing • Test duration (number of days) • Type of testing performed • Number of Usability Tests Remember: “Building” / “applying” usability on any product is an iterative process Usability Engineering - Ways of Studying User Interfaces (Part A)

  9. Usability Lab • A place where usability testing takes place. • An environment where users are studied interacting with a system for the sake of evaluating the system's usability. • The user sits in front of a computer or stands in front of the system's interface, alongside a facilitator (*) who gives the user tasks to perform. • (*) Facilitator: a person who helps and guides other people towards a goal, with a ‘content-neutral’ behavior (neither a positive nor a negative attitude). In most usability tests, one of the observers also acts as facilitator. Usability Engineering - Ways of Studying User Interfaces (Part A)

  10. Usability Lab • A number of observers watch the interaction, take notes, and ensure the activity is recorded. • Observers are usually in a different room (observation room) and watch the user through a camera or a one-way mirror. • Typically, sessions are videotaped and specialized software logs interaction details. • In some cases the testing room and the observation room are not adjacent; in that case video and audio are transmitted over the network. Usability Engineering - Ways of Studying User Interfaces (Part A)

  11. Usability Lab Examples (images: cameras, one-way mirror) Usability Engineering - Ways of Studying User Interfaces (Part A)

  12. Usability Lab Examples (image: cameras) Usability Engineering - Ways of Studying User Interfaces (Part A)

  13. Portable Usability Lab (software testing) • For usability tests on the go • Minimum configuration: • A full-size laptop for the user, equipped with a real mouse (no touchpad) • A second laptop for the facilitator (could be smaller) • A webcam to record the user's think-aloud comments and facial expressions • Appropriate software (Morae, Camtasia, etc.) Usability Engineering - Ways of Studying User Interfaces (Part A)

  14. Usability Test Video Example • In this video example the user is presented with a university website and is asked to carry out the following three tasks: • 1. Find the name and email of the Dean of Arts & Sciences • 2. Find the advisor for the Public Relations department • 3. Find the location of the “Writing Center” • The moderating technique used for this session is Concurrent Think-Aloud (CTA) • The session was recorded using ‘Morae’, a software environment for recording and processing usability tests. Usability Engineering - Ways of Studying User Interfaces (Part A)

  15. Usability Testing in a Nutshell • Planning / Running / Analyzing / Reporting • 1. Getting Started • Identify the target audience (in general it consists of more than one user group) • 2. Creating Usability Test Tasks • Typically, participants will perform a set of 5 to 10 tasks within a 90-minute session • Tasks should represent the most common user goals. If a website is tested, tasks may also represent the most important conversion goals from the owner's perspective (e.g. making a purchase) • Conversion: any action taken by a user that satisfies the website owner's business goals, e.g. making a purchase, signing up for an email newsletter, viewing an important webpage etc. Usability Engineering - Ways of Studying User Interfaces (Part A)

  16. Usability Testing in a Nutshell - 2 • 3. Conducting a Usability Test • The researcher reads a participant one task at a time, such as “Find out how to contact technical support,” and allows the participant to complete the task without any guidance. • To prevent bias, the researcher follows the same “script” when explaining the task to each participant. • The researcher may also ask the participant to think aloud while working on a task, to better understand the participant's mental model for the task and their decision-making in real time. • When the participant has completed a task, the researcher sets up the starting point for the next task and continues the test. • Ideally, task order is counterbalanced from participant to participant. • Counterbalancing: a technique in experimental design used to avoid the introduction of confounding variables. In usability testing, it is most commonly used when establishing task order (e.g. some participants first find the sitemap and then sign up for the email newsletter, others do the reverse). Usability Engineering - Ways of Studying User Interfaces (Part A)
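
A rough idea of how counterbalanced task orders can be generated, sketched below in Python. The task names and participant count are invented for illustration; a stricter design (e.g. a balanced Latin square) could be substituted, but a simple rotation already puts every task in every serial position equally often across participants.

```python
# A minimal sketch (not part of the original slides) of counterbalancing task
# order with a simple rotation: across participants, every task appears in
# every serial position equally often. Task names are hypothetical examples.

def rotated_orders(tasks, n_participants):
    """Assign each participant a rotated copy of the task list."""
    n = len(tasks)
    return [tasks[i % n:] + tasks[:i % n] for i in range(n_participants)]

tasks = ["Find the sitemap", "Sign up for the email newsletter",
         "Contact technical support", "Locate the privacy policy"]

for pid, order in enumerate(rotated_orders(tasks, n_participants=8), start=1):
    print(f"Participant {pid}: {order}")
```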

  17. Usability Testing in a Nutshell - 3 • 4. Usability Test Analysis • Usability testing recording software (e.g. Morae, Camtasia…) may be used to record the computer screen and the participant's voice and facial expressions during testing. • This software can also facilitate tracking of user behaviors, including mouse clicks, keystrokes, and active or open windows. • When all participants have completed the study, the researcher compiles the data to determine the severity of each usability issue that was encountered and provides prioritized recommendations for the development team to meet usability requirements. • Usability testing should be conducted at various times throughout the iterative design process to ensure that all usability requirements have been met in the final product. Usability Engineering - Ways of Studying User Interfaces (Part A)
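
To make the behavior-tracking idea concrete, here is a small sketch of summarising a logged event stream per task. The CSV layout (participant, task_id, event_type, timestamp) is a hypothetical export format assumed for illustration, not the actual output of Morae or Camtasia.

```python
# A rough sketch of summarising a behavioural event log (mouse clicks,
# keystrokes) per task. The CSV layout is a hypothetical export format,
# not the actual output of any particular recording tool.
import csv
from collections import Counter

def summarise_events(path):
    """Count events of each type per task from a CSV log."""
    counts = Counter()  # (task_id, event_type) -> count
    with open(path, newline="") as f:
        # assumed columns: participant, task_id, event_type, timestamp
        for row in csv.DictReader(f):
            counts[(row["task_id"], row["event_type"])] += 1
    return counts

if __name__ == "__main__":
    for (task, event), n in sorted(summarise_events("events.csv").items()):
        print(f"task {task}: {n} x {event}")
```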

  18. Planning a Usability Test • Elements of a Test Plan • Scope • Indicate what you are testing • Specify how much of the product the test will cover, e.g. the prototype as of a specific date / navigation / navigation and content… • Purpose - Identify concerns, questions, and goals for this test • can be broad or specific: “can users navigate to important information from the prototype's home page?” “will users easily find the search box in its present location?” • concerns drive the scenarios you choose for your tests • Schedule & Location • where / when / how long Usability Engineering - Ways of Studying User Interfaces (Part A)

  19. Planning a Usability Test • Elements of a Test Plan – continued • Participants • number of users: equally distributed in classes • types (classes) of users: 3-4 different classes are usually enough to cover most of the target audience • Sessions • one per user • description: Session ID / class / user ID (no real names!) • length: typically 60 to 90 minutes • allow 30 minutes of buffer time between sessions to > reset the environment > briefly review the session > compensate for late (or early) arrival of the next participant Usability Engineering - Ways of Studying User Interfaces (Part A)

  20. Planning a Usability Test • Elements of a Test Plan – continued • Equipment & stuff • user's equipment: desktop / laptop / smartphone / tablet… • evaluator's equipment: PC(s), camera(s), forms, paper, office supplies, coffee, water, small gifts of gratitude… • Scenarios • number and types of tasks in each session • typical time per task: 5 to 10 minutes • Metrics • subjective: questions to be asked at various stages of the testing • quantitative: successful completion rates, error rates, time on task… • Roles • list of staff who will participate and the role each will play (facilitator, evaluators, staff leader…) Usability Engineering - Ways of Studying User Interfaces (Part A)

  21. Planning a Usability Test • Examples of Quantitative Test Metrics • Time on task: the amount of time it takes the participant to complete the task • Number and rate of successful & unsuccessful task completions • a scenario is successfully completed when the participant indicates they have found the answer or completed the task goal • Number and rate of critical and non-critical errors • Critical: inability to complete a task of the scenario • Non-critical: errors recovered by the participant that do not prevent the participant from successfully completing the task • Error-free rate: percentage of test participants who complete the task without any errors Usability Engineering - Ways of Studying User Interfaces (Part A)
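
A minimal sketch of how these metrics might be computed from per-participant task records. The field names and example values are assumptions made for illustration, not data from the slides.

```python
# Computing the quantitative metrics listed above for one task scenario.
# Field names and values are illustrative assumptions.

results = [  # one record per participant for task T1
    {"participant": "P1", "completed": True,  "time_s": 74,  "errors": 0},
    {"participant": "P2", "completed": True,  "time_s": 98,  "errors": 2},
    {"participant": "P3", "completed": False, "time_s": 300, "errors": 1},
]
n = len(results)

completion_rate = sum(r["completed"] for r in results) / n                     # successful completions
error_free_rate = sum(r["completed"] and r["errors"] == 0 for r in results) / n  # completed with no errors
mean_time_on_task = sum(r["time_s"] for r in results) / n                      # seconds

print(f"completion {completion_rate:.0%}, error-free {error_free_rate:.0%}, "
      f"mean time on task {mean_time_on_task:.0f}s")
```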

  22. Planning a Usability Test • Examples of Subjective Test Metrics • Self-reported Participant Ratings • satisfaction / ease of use / ease of finding information etc. • rated on a 5- to 7-point Likert scale • Likes, dislikes and recommendations (open-ended questions) Usability Engineering - Ways of Studying User Interfaces (Part A)
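
A small sketch of aggregating such self-reported ratings, assuming a 5-point Likert scale (1 = very dissatisfied, 5 = very satisfied); the ratings are invented for illustration.

```python
# Aggregating hypothetical 5-point Likert satisfaction ratings.
from statistics import mean, median

satisfaction = [4, 5, 3, 4, 2, 5, 4]  # one rating per participant

top_2_box = sum(r >= 4 for r in satisfaction) / len(satisfaction)  # share rating 4 or 5
print(f"mean {mean(satisfaction):.1f}, median {median(satisfaction)}, "
      f"top-2-box {top_2_box:.0%}")
```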

  23. Running a Usability Test • Choosing a Moderating Technique • Concurrent Think Aloud (CTA): participants “think aloud” while they interact with the product; encourages a running stream of consciousness as they work. • Retrospective Think Aloud (RTA): participants are asked to retrace their steps when the session is complete; often participants watch a video replay of their actions. • Concurrent Probing (CP): as participants work on tasks, when they say something interesting or do something unique, the researcher asks follow-up questions. • Retrospective Probing (RP): waiting until the session is complete and then asking questions about the participant's thoughts and actions at specific points of interest. Usability Engineering - Ways of Studying User Interfaces (Part A)

  24. Running a Usability Test • Example of a Typical Usability Test Session • The facilitator welcomes the participant, explains the test session, asks the participant to sign the “consent form”, and asks any pre-test or demographic questions. • The facilitator explains the moderating technique (e.g. CTA) and asks if the participant has any additional questions. The facilitator explains where to start. • The participant reads the task scenario aloud and begins working on the scenario. • The other members of the staff take notes of the participant's behaviors, comments, errors and completion (success or failure) on each task. Usability Engineering - Ways of Studying User Interfaces (Part A)

  25. Running a Usability Test • Example of a Typical Usability Test Session - continued • The session continues until all task scenarios are completed or a predefined time limit has been reached (time limits may also exist for each individual task scenario, to avoid user frustration and deadlock). • The facilitator asks the end-of-session subjective questions (de-briefing), thanks the participant, gives them a small token of appreciation and escorts them out. • The facilitator then resets the materials and equipment, speaks briefly with the observers and calls in the next participant. Usability Engineering - Ways of Studying User Interfaces (Part A)

  26. Analyzing Usability Test Data • Use spreadsheets to record and analyze data for each task scenario • Success rates • Task time • Critical / non-critical errors • “Quantified” qualitative data (as in Likert scales) • Participants' demographic data, to observe variations due to demographic variables (sex, age, occupation, education…) • Categorize detected usability problems by severity • Critical: if not fixed, users will not be able to complete the scenario • Serious: if not fixed, many users will be frustrated and some may give up • Minor: users are annoyed, but this does not keep them from completing the scenario; it should be revisited later. Usability Engineering - Ways of Studying User Interfaces (Part A)
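
A hedged sketch of the same spreadsheet-style analysis done in pandas: per-task success rates, task times and error counts, plus per-participant success. Column names and the example values are assumptions made only for illustration.

```python
# Per-task and per-participant summaries of illustrative usability-test data.
import pandas as pd

df = pd.DataFrame({
    "participant":         ["P1", "P1", "P2", "P2", "P3", "P3"],
    "task":                ["T1", "T2", "T1", "T2", "T1", "T2"],
    "completed":           [True, True, True, False, False, True],
    "time_s":              [60, 120, 85, 240, 300, 95],
    "critical_errors":     [0, 0, 0, 1, 1, 0],
    "non_critical_errors": [1, 0, 2, 1, 0, 1],
})

summary = df.groupby("task").agg(
    success_rate=("completed", "mean"),
    mean_time_s=("time_s", "mean"),
    critical=("critical_errors", "sum"),
    non_critical=("non_critical_errors", "sum"),
)
print(summary)
print(df.groupby("participant")["completed"].mean())  # success rate by participant
```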

  27. Writing the Usability Test Report • Background Summary: • what has been tested • where and when • equipment information • actions performed before, during and after the test • the testing team • brief description of problems encountered • brief description of what worked well • Methodology: • description of test sessions • type of interface between user and testing team • metrics collected • overview of task scenarios • overview of participants Usability Engineering - Ways of Studying User Interfaces (Part A)

  28. Writing the Usability Test Report • Test Results: • summary of the successful task completion rates • by participant • by task • average success rate • by participant • by task • percentage of participants who completed each scenario • percentage of participants who completed all scenarios • average time taken to complete each scenario • participant comments (the most illustrative ones). Usability Engineering - Ways of Studying User Interfaces (Part A)

  29. Writing the Usability Test Report • Findings and Recommendations: • three reporting options • a single overall list of findings and recommendations • findings and recommendations scenario by scenario • both a list of major findings and recommendations that cut across scenarios and a scenario-by-scenario report • each finding (or group of related findings) should: • have a basis in data (what you actually saw, heard and noted) • include a statement of the situation (as specific as possible) • include recommendations on what to do. It is useful to also report positive findings: what is working well must be maintained through further development. Usability Engineering - Ways of Studying User Interfaces (Part A)

  30. Writing the Usability Test Report • What Not to include • The names of participants (refer to them as User A, User B…) • Descriptions of participants from which a reader could deduce identities • Generalized conclusions that are misleading due to poor sampling or other factors • Test hypotheses / unsubstantiated opinions / biases • Entire listings of all of the raw data and observations (supply only those necessary, and provide summaries as needed) Usability Engineering - Ways of Studying User Interfaces (Part A)

  31. Ethical Issues in Usability Work involving Users • The important ethical concern is: do no harm to the users. • What harm can be done? Mostly psychological. Users may feel: • pressure to conform (sometimes highly specialized experts feel particularly stressed) • inadequate or stupid when they make errors • worried about what might happen to the information being gathered about them • concerned that information falling into the wrong hands can influence managerial decision-making • Hence: avoid management involvement in usability sessions Usability Engineering - Ways of Studying User Interfaces (Part A)

  32. What must you as a Usability Engineer do when working with Users? • Before the session: make appropriate preparations and checks; explain certain things to users and answer possible questions • During the session: observe users and behave appropriately • After the session: debrief and express gratitude to users (thank you, small gift) • Debriefing: a process of questioning to gain information from an individual after a mission, and to instruct the individual as to what information can be released to the public and what information is restricted (military origin) • The above points apply both when measuring and when merely observing (whether recording or not) Usability Engineering - Ways of Studying User Interfaces (Part A)

  33. Things to prepare and check before the session • Learn the software yourself • Have a moderate ability with most features so as to better understand what users are doing, prepare tasks, and help users if they are stuck • Decide on: • whether you will be measuring time (e.g. to gather metrics) or just observing to find problems • whether to focus on learnability or efficiency of use (affects the choice of participants) • whether to have longer tasks or a series of very short tasks • the level of interaction with participants during sessions: minimize interaction if you are timing the tasks, otherwise think-aloud interaction is best Usability Engineering - Ways of Studying User Interfaces (Part A)

  34. Things to prepare and check before the session (continued) • Conduct a high-level task analysis • Make sure tasks are carefully designed so as to be understandable and within user capabilities • Run a pilot test, preferably with another member of your team • Select participants (well in advance) • Plan for more participants than you really need, in case one or more back out • Prepare “informed consent forms” for participants to sign (more later) • Make sure the test software and recording equipment actually work, so as not to frustrate the user - let alone yourself Usability Engineering - Ways of Studying User Interfaces (Part A)

  35. Things to explain to users before the session • Tell users about the work in advance. Explain (orally and in writing): • the nature and goals of the evaluation • how long it will take • what recording or measurement is being done • what will happen to the data • the fact that there are no known risks to them (presumably) • if in a company: that their manager has given them permission to participate, but he/she will not see individual results; that they are free to participate or not, and to withdraw at any time • any other rights (e.g. the right to see the results) • who they can complain to • Give users informed consent forms to sign, stating in particular that they are not being evaluated as individuals Usability Engineering - Ways of Studying User Interfaces (Part A)

  36. Informed Consent Form Example Usability Engineering - Ways of Studying User Interfaces (Part A)

  37. Things to do During the session • Keep the environment relaxed. Leave plenty of free time • Give the users one or more 'easy' tasks to do first, for warm-up • easy tasks boost confidence (exclude warm-up tasks from the results) • Hand the user test tasks one at a time • so users don't feel overwhelmed about having too many things to do • it is not their fault if they are slow • For each task: • read it to the user • hand it to the user so they can re-read it when necessary • Ask the user to speak out loud, explaining what they are doing and why • not all users are good at this, so be patient and remind the user from time to time to keep talking Usability Engineering - Ways of Studying User Interfaces (Part A)

  38. During the session - continued • Avoid coaching the user or letting him/her struggle endlessly; try to balance, as either extreme can make the user feel inadequate • Periodically, pose questions to stimulate the user to give useful information (see next slide) • Refrain from laughing or making negative comments about what the user is doing • Ensure there are no disturbances, and no 'audience' • Limit sessions to 1.5 hours or less, with a 10-minute break in the middle • Don't use the same participant more than once in a day • Provide coffee etc. for sessions longer than an hour • Terminate the session if the user becomes overly frustrated. Usability Engineering - Ways of Studying User Interfaces (Part A)

  39. During the session - continued • Questions to stimulate the user (and when the answer signals a problem): • What do you want to do? Problem if: they do not know; the system cannot do what they want. • What do you think would happen if ...? Problem if: they do not know; they give a wrong answer. • What do you think the system has done? Problem if: they do not know; they give a wrong answer. • What do you think this information is telling you? Problem if: they do not know; they give a wrong answer. • Why did the system do that? Problem if: they do not know; they give a wrong answer. • What were you expecting to happen? Problem if: they had no expectation; they were expecting something else. Usability Engineering - Ways of Studying User Interfaces (Part A)

  40. After the session • Debrief the user • let them tell you anything they want • ask them specific questions about difficulties they had • tell them that they helped you find problems and that their assistance is appreciated • tell them that it may not be possible to fix all the problems, but you will try • remind them that the results will be confidential • consider giving them a token of appreciation (pin, thank-you note etc.) • Keep the data confidential • refer to users as 'User 1', 'User 2' etc., as long as nobody can 'read between the lines' and infer who is who • when asking users for permission to publish specific video clips, be very careful to ensure they are freely agreeing. Usability Engineering - Ways of Studying User Interfaces (Part A)

  41. The Importance of Video Recording • Really important to see emotional responses and hear users' feedback • Without it, 'you see only what you want to see' (i.e. you interpret what you see based on your mental model) • In the 'heat of the moment' you miss many things • Minor details (e.g. body language) are captured • You can repeatedly analyze the recording, looking for different problems Usability Engineering - Ways of Studying User Interfaces (Part A)

  42. Usability Evaluation Types • User-centered Evaluation • Usability Testing • Other usability studies (card sorting, eye tracking, quantitative surveys…) • Expert-based Evaluation • Usability inspection methods: Heuristic Evaluation / Cognitive Walkthrough / Pluralistic Walkthrough / Heuristic Walkthrough / Perspective-based Inspection / … • Review-based evaluation • Model-based Evaluation • GOMS Model (Goals, Operators, Methods, Selection rules) • CMN GOMS (Original version of GOMS by Card, Moran and Newell 1983) • KLM GOMS (Keystroke Level Model GOMS) • NGOMSL (Natural GOMS Language) • CPM GOMS (Cognitive Perceptual Motor / Critical Path Method GOMS) Usability Engineering - Ways of Studying User Interfaces (Part A)

  43. Usability Inspection Methods • A family of methods used by experienced practitioners to assess usability issues • Analytic techniques - no user involvement • Generate results in a fraction of the time and at a fraction of the cost of usability testing • Used to supplement, NOT replace, direct user involvement • Members of the Usability Inspection Methods family: • Heuristic Evaluation • Cognitive Walkthrough • Pluralistic Walkthrough • Heuristic Walkthrough • Perspective-Based Evaluation • Review-Based Evaluation (Heuristic Evaluation and Cognitive Walkthrough are the most prominent) Usability Engineering - Ways of Studying User Interfaces (Part A)

  44. Heuristic Evaluations / Cognitive Walkthroughs • Also referred to as Expert Reviews because they are usually performed by experts in Usability or HCI. • Best results are obtained when • evaluators are Double Experts (have expertise in both HCI and the specific domain of the application) • evaluators go through the interface at least twice • First pass: get a feel for the flow of the interaction and the general scope of the system. • Second pass: focus on specific interface elements while knowing how they fit into the larger whole. • Heuristic Evaluations and Cognitive Walkthroughs should be done prior to and in addition to user testing, not instead of it. Usability Engineering - Ways of Studying User Interfaces (Part A)

  45. Heuristic Evaluation • Usually part of an iterative design process • Basis for evaluation: a specific list of design guidelines, frequently called ‘heuristics’ • System reviewed by a small number of evaluators - often UI experts or designers • Evaluators comment on usability problems in relation to each heuristic • Evaluators decide on their own how they want to proceed with evaluating the interface • How many evaluators? • The more the better, obviously • But: if cost is taken into account, we must find the optimal number Usability Engineering - Ways of Studying User Interfaces (Part A)
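
One way to reason about that trade-off (not spelled out on the slide) is Nielsen and Landauer's problem-discovery model: the share of usability problems found by i evaluators is roughly 1 - (1 - L)^i, where L is the probability that a single evaluator finds a given problem (about 0.31 on average in their data). The quick tabulation below shows the diminishing returns behind the common recommendation of around 3-5 evaluators.

```python
# Nielsen & Landauer's problem-discovery model: proportion of usability
# problems found by i evaluators, assuming each finds a given problem with
# probability L (~0.31 on average in their published data).

def proportion_found(i, L=0.31):
    return 1 - (1 - L) ** i

for i in (1, 2, 3, 5, 10, 15):
    print(f"{i:2d} evaluators -> ~{proportion_found(i):.0%} of problems found")
```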

  46. Lists of Heuristics • There is not a single set of heuristics. • Nielsen's 10 famous heuristics (remember them?) are the ones most widely used, but there are also others: • Gerhardt-Powals' 10 Cognitive Engineering Principles • Bastien and Scapin's 18 Ergonomic Criteria • Connell & Hammond's 30 Usability Principles • Smith & Mosier's 944 guidelines for the design of user interfaces • and more… Usability Engineering - Ways of Studying User Interfaces (Part A)

  47. Original List of Nielsen Heuristics • Proposed by Nielsen and Molich in 1990 – contains 9 items • (Replaced in 1993 by a 10-item revised list) • 1. Simple and natural dialog • Simple means no irrelevant or rarely used information. Natural means an order that matches the task. • 2. Speak the user's language • Use concepts from the user's world. Don't use system-specific engineering terms. • 3. Minimize user memory load • Don't make the user remember things from one action to the next. • Leave information on the screen until it is no longer needed. • 4. Be consistent • Action sequences learned in one part of the system should apply in other parts. Usability Engineering - Ways of Studying User Interfaces (Part A)

  48. Original List of Nielsen Heuristics - 2 • 5. Provide feedback • Let users know what effect their actions have on the system. • 6. Provide clearly marked exits • If users get into a part of the system that doesn't interest them, they should be able to get out quickly without damaging anything. • 7. Provide shortcuts • Help experienced users avoid lengthy dialogs and informational messages they don't need. • 8. Good error messages • Let the user know what the problem is and how to correct it. • 9. Prevent errors • Whenever you write an error message, ask if that error could have been prevented. Usability Engineering - Ways of Studying User Interfaces (Part A)

  49. Revised List of Nielsen Heuristics (1993) • Most common list of heuristics in use today • 1. Visibility of system status • The system should always keep users informed about what is going on, through appropriate feedback within reasonable time. • 2. Match between system and the real world • The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order. • 3. User control and freedom • Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo. • 4. Consistency and standards • Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions. • 5. Error prevention • Even better than good error messages is a careful design that prevents a problem from occurring in the first place. Usability Engineering - Ways of Studying User Interfaces (Part A)

  50. Revised List of Nielsen Heuristics (1993) - 2 • 6. Recognition rather than recall • Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate. • 7. Flexibility and efficiency of use • Accelerators (unseen by the novice user) may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions. • 8. Aesthetic and minimalist design • Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility. • 9. Help users recognize, diagnose and recover from errors • Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution. • 10. Help and documentation • Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large. Usability Engineering - Ways of Studying User Interfaces (Part A)
