
Ch 13. Asking Users & Experts


Presentation Transcript


  1. Ch 13. Asking Users & Experts Team 3: Jessica Herron Lauren Sullivan Chris Moore Steven Pautz

  2. Asking Users and Experts: Introduction • We want to find out what users do, want to do, like, or don’t like • Several tools & techniques are available • Interviews • Questionnaires • Heuristic Evaluation • Cognitive Walkthroughs • Strengths/Weaknesses of these tools

  3. Interviews

  4. Interview Basics • Remember the DECIDE framework • Determine goals, Explore questions, Choose the evaluation paradigm, Identify practical issues, Decide on ethical issues, and Evaluate data. • Goals and questions guide all interviews • Two types of questions: closed (fixed set of answers) and open

  5. Avoid Interview Pitfalls • Long questions • Compound sentences - split into two • Jargon & language that the interviewee may not understand • Leading questions that make assumptions e.g., why do you like …? They might not actually like the product. • Be aware of unconscious biases • e.g., gender stereotypes

  6. Interview Components

  7. The Interview Process • Dress in a similar way to participants • Check recording equipment in advance • Devise a system for coding names of participants to preserve confidentiality. • Be pleasant • Ask participants to complete an informed consent form

  8. Group interviews • Also known as focus groups • Typically 3-10 participants • Provide a diverse range of opinions • Need to be managed to: ensure everyone contributes, keep one person from dominating the discussion, cover the agenda of topics, and eliminate digressions

  9. Other Sources • Telephone Interview • Can’t see participant’s body language or nonverbal clues. • Online Interviews • Asynchronous: E-mail or Synchronous: Chat Room • Retrospective Interview • Check with the participant to be sure interviewer correctly understood their remarks.

  10. Analyzing interview data • Depends on the type of interview • Structured interviews can be analyzed like questionnaires • Unstructured interviews generate data like that from participant observation • It is best to analyze unstructured interviews as soon as possible to identify topics and themes from the data

  11. Links • UPA – (Usability Professionals Association) • Discovering User Needs Discovering User Needs • Sage Services - User Research & Usability Techniques Available Through Sage.

  12. Questionnaires • Questions can be closed or open • Closed questions are easiest to analyze, and may be done by computer • Can be administered to large populations • Paper, email & the web used for dissemination • Advantage of electronic questionnaires is that data goes into a database & is easy to analyze • Sampling can be a problem when the size of a population is unknown, as is common online
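Closed questions lend themselves to automatic analysis, as the slide notes. A minimal sketch of the kind of tally a computer can do, using made-up responses to a single 5-point Likert item:

```python
from collections import Counter

# Hypothetical responses to one closed question on a 5-point Likert
# scale (1 = strongly disagree ... 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

def summarize_likert(scores):
    """Tally a closed Likert item and compute its mean rating."""
    counts = Counter(scores)
    mean = sum(scores) / len(scores)
    return counts, mean

counts, mean = summarize_likert(responses)
print(counts[4], round(mean, 2))  # four respondents chose 4; mean 3.9
```

Open-ended responses, by contrast, would need manual coding before any such tally is possible.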

  13. Questionnaire Style • Questionnaire formats can include: 'yes'/'no' checkboxes; checkboxes that offer many options; Likert rating scales; semantic differential scales; open-ended responses

  14. Designing Questionnaires • Make questions clear and specific. • When possible, ask closed questions and offer a range of answers. • Consider including a 'no-opinion' option for questions that seek opinions. • Think about the ordering of questions. • Avoid complex multiple questions.

  15. Designing Questionnaires • When scales are used, make sure the range is appropriate and does not overlap. • Make sure the ordering of scales is intuitive and consistent. • Avoid jargon. • Provide clear instructions on how to complete the questionnaire. • Strike a balance between white space and the need to keep the questionnaire compact.

  16. Encouraging a good response • Ensure the questionnaire is well designed so that participants do not get annoyed. • Provide a short overview section. • Include a stamped, self-addressed envelope. • Contact respondents through a follow-up letter. • Offer incentives. • Explain why you need the questionnaire to be completed.

  17. Online Questionnaires • Advantages: • Responses received quickly. • Copying and postage costs lower. • Data can be transferred immediately to database. • Time required is reduced. • Errors in questionnaire can be corrected easily.

  18. Online Questionnaires • Disadvantages: • Sampling is problematic if population size is unknown. • Preventing individuals from responding more than once is difficult. • Individuals have also been known to change questions in email questionnaires.
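One common mitigation for the repeat-response problem, sketched here with hypothetical helper names (the slide does not prescribe a mechanism), is to issue each invited respondent a single-use token:

```python
import secrets

# Tokens handed out with each questionnaire invitation; a submission
# is accepted only if it carries a valid, unused token.
issued = set()

def issue_token():
    """Create an unguessable one-time token for one respondent."""
    token = secrets.token_urlsafe(16)
    issued.add(token)
    return token

def redeem_token(token):
    """Accept a response only if its token is valid and unused."""
    if token in issued:
        issued.remove(token)  # one use only
        return True
    return False

t = issue_token()
print(redeem_token(t))  # True: first submission accepted
print(redeem_token(t))  # False: repeat submission rejected
```

This stops casual duplicates, though it cannot stop a respondent who is invited more than once, which is why sampling remains the harder problem.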

  19. Developing Web-based Questionnaire • Steps: • Produce an error-free interactive electronic version from the original paper-based one. • Make the questionnaire accessible from all common browsers and readable for different sized monitors and network locations. • Make sure information identifying each respondent will be captured and stored confidentially. • User-test the survey with pilot studies before distributing.

  20. Asking experts • Experts use their knowledge of users & technology to review software usability • Expert critiques (crits) can be formal or informal reports • Heuristic evaluation is a review guided by a set of heuristics • Walkthroughs involve stepping through a pre-planned scenario noting potential problems

  21. Heuristic evaluation • Developed by Jakob Nielsen in the early 1990s • Based on heuristics distilled from an empirical analysis of 249 usability problems • These heuristics have been revised for current technology, e.g., HOMERUN for the web • Heuristics are still needed for mobile devices, wearables, virtual worlds, etc. • Design guidelines form a basis for developing heuristics

  22. Nielsen’s heuristics • Visibility of system status • Match between system and real world • User control and freedom • Consistency and standards • Help users recognize, diagnose, recover from errors • Error prevention • Recognition rather than recall • Flexibility and efficiency of use • Aesthetic and minimalist design • Help and documentation

  23. HOMERUN • High-quality content • Often updated • Minimal download time • Ease of use • Relevant to users' needs • Unique to the online medium • Netcentric corporate culture

  24. Discount evaluation • Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. • Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
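The 75-80% figure on this slide follows from the curve Nielsen and Landauer fitted to evaluation data, where the proportion of problems found by i independent evaluators is 1 - (1 - λ)^i and λ is the detection rate of a single evaluator. The sketch below assumes λ ≈ 0.31, a value from Nielsen's data; it varies between studies, which is why the predicted share for five evaluators only roughly matches the slide's range:

```python
def proportion_found(evaluators, lam=0.31):
    """Nielsen-Landauer estimate of the share of usability problems
    found by a given number of independent evaluators.
    lam is the assumed detection rate of one evaluator."""
    return 1 - (1 - lam) ** evaluators

for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
```

The curve's steep early rise and flat tail is the economic argument for "discount" evaluation: going from 5 to 10 evaluators buys comparatively little.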

  25. 3 stages for doing heuristic evaluation • Briefing session to tell experts what to do • Evaluation period of 1-2 hours in which each expert works separately, taking one pass to get a feel for the product and a second pass to focus on specific features • Debriefing session in which experts work together to prioritize problems
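The debriefing stage, where experts pool and prioritize their findings, can be sketched as a simple aggregation of per-evaluator severity ratings. The 0-4 severity scale and the example problems below are illustrative, not from the slides:

```python
from statistics import mean

# Hypothetical debriefing data: each evaluator rates each problem's
# severity (0 = not a problem ... 4 = usability catastrophe).
ratings = {
    "orphan page in checkout": [4, 3, 4],
    "inconsistent link colors": [2, 1, 2],
    "no feedback after submit": [3, 4, 3],
}

def prioritize(severity_ratings):
    """Rank problems by mean severity across evaluators, worst first."""
    return sorted(severity_ratings,
                  key=lambda p: mean(severity_ratings[p]),
                  reverse=True)

print(prioritize(ratings)[0])  # the highest-severity problem
```

Averaging across evaluators matters because individual severity judgments are noisy; the pooled ranking is what guides which fixes happen first.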

  26. Heuristic evaluation of websites • MEDLINEplus – medical information website • Keith Cogdill, commissioned by NLM, evaluated this website • Based evaluation on the following: • Internal consistency • Simple dialog • Shortcuts • Minimizing the user's memory load • Preventing errors • Feedback • Internal locus of control

  27. MEDLINEplus suggestions for improvement • Arrangement of health topics • Arrange topics alphabetically as well as in categories • Depth of navigation menu • A higher "fan-out" in the navigation would enhance usability, meaning many short menus rather than a few deep ones.

  28. Turning guidelines into heuristics for the web • Navigation, access and information design are the three categories for evaluation • Navigation • Avoidance of orphan pages • Avoidance of long pages with excessive white space (scrolling problems) • Provide navigation support • Avoidance of narrow, deep, hierarchical menus • Avoidance of non-standard link colors • Provide a consistent look and feel

  29. Turning guidelines into heuristics for the web (cont.) • Access • Avoidance of complex URLs • Avoid long download times that annoy users • Pictures • .wav files • Large software files • Information Design • Have good graphical design • No outdated or incomplete information • Avoid excessive color usage • Avoid gratuitous use of graphics and animation • Be consistent

  30. Advantages and problems • Few ethical & practical issues to consider • Can be difficult & expensive to find experts • Best experts have knowledge of application domain & users • Biggest problems: important problems may get missed, and many trivial problems are often identified

  31. Walkthroughs • Alternative approach to heuristic evaluation • Expert walks through task, noting problems in usability • Users are usually not involved • Several forms: • Cognitive Walkthrough • Pluralistic Walkthrough

  32. Cognitive Walkthroughs • Involves simulating the user’s problem-solving process at each step in the task • Focus on ease of learning • Examines user problems in detail, without needing users or a working prototype • Can be very time-consuming and laborious • Somewhat narrow focus • Adaptations may offer improvements

  33. Cognitive Walkthrough Steps • One or more experts walk through the design description or prototype with a specific task or scenario • For each task, they answer three questions: • Will users know what to do? • Will users see how to do it? • Will users understand from feedback whether the action was correct or not?

  34. Cognitive Walkthrough Steps • A record of critical information is compiled • Assumptions about what would cause problems and why, including explanations for why users would face difficulties • Notes about side issues and design changes • Summary of results • Design is revised to fix the problems identified
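The record compiled during a walkthrough can be sketched as a small data structure. The field names here are illustrative, but the three boolean fields mirror the three questions evaluators answer at each step:

```python
from dataclasses import dataclass

@dataclass
class WalkthroughStep:
    """One step of a cognitive walkthrough, recording the evaluators'
    answers to the three standard questions plus free-form notes."""
    action: str
    knows_what_to_do: bool      # Will users know what to do?
    sees_how_to_do_it: bool     # Will users see how to do it?
    understands_feedback: bool  # Will feedback show the action worked?
    notes: str = ""

    def has_problem(self):
        """A step is flagged if any of the three answers is 'no'."""
        return not (self.knows_what_to_do
                    and self.sees_how_to_do_it
                    and self.understands_feedback)

step = WalkthroughStep("Tap 'Checkout'", True, False, True,
                       notes="Button hidden below the fold")
print(step.has_problem())  # True: this step needs a design revision
```

Collecting steps like this makes the summary and revision stages mechanical: the flagged steps, with their notes, are the problem list the redesign works from.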

  35. Pluralistic Walkthroughs • Variation of Cognitive Walkthrough • Users, developers, and experts work together to step through a scenario of tasks • Strong focus on users’ tasks • Provides quantitative data • May be limited by scheduling difficulties and time constraints

  36. Pluralistic Walkthrough Steps • Scenarios are developed in the form of a series of screens • Panel members individually determine the sequence of actions they would use to move between screens • Everyone discusses their sequence with the rest of the panel • The panel moves to the next scenario

  37. Asking Users and Experts: Summary • Various techniques exist for getting user opinions • Structured, unstructured, and semi-structured • Interviews, focus groups, questionnaires • Expert evaluations: heuristic evaluation and walkthroughs • Various techniques are often combined for added accuracy and usefulness

  38. Interview: Jakob Nielsen • “Father of Web Usability” • Pioneered heuristic evaluation • Demonstrated cost-effective methods for usability testing & engineering • Author of several acclaimed design books • www.useit.com • www.nngroup.com
