
Web Usability: Data Analysis & Progress Report



  1. Web Usability: Data Analysis & Progress Report • English 407B • Julie Staggers

  2. Usability testing: The process • Determine the goals of the study • Develop a profile of typical users • Write the user tasks • Plan the test • Conduct user tests • Evaluate the data • Recommend or implement changes in your final report

  3. Where we’re headed • Today: start evaluating data and work on the Progress Report • Next week: start writing the report • Looking ahead, it may help to review several examples of professional usability reports before starting your own: • Visa Usability Report • OneStart Portal Usability Report • Report from a 1994 Usability Study

  4. Time to make sense of your data • Step 1: Compile your findings* – two general ways to group: • By tasks • By answers to questions in your test protocol • Step 2: Look for significant patterns • What did most users have the most difficulty with? • How severe were their difficulties? • How satisfied were they with the aspects you were testing? • Step 3: Brainstorm solutions or fixes** and support them with advice/research from the experts • Note: As you discover patterns in your user-test results, consult Web site design principles for expert opinions on how to improve these elements. *Goes into the “Findings” section of your final report **Becomes the “Recommendations” section of your final report

  5. Analyzing your data • What data will you have? • What do you do with quantitative data? • What do you do with qualitative data? • How do you know what are the most important results? • Source: Usability.gov “Analyze the Results: Test and Refine” http://www.usability.gov/refine/results.html

  6. What data will you have? • Quantitative (numeric, measurable) data such as: • success rates • time to complete tasks • pages visited • error rates • ratings on a satisfaction questionnaire • Qualitative (non-numeric, descriptive) data such as: • notes of your observations about the pathways participants took • notes about problems participants had • notes of what participants said as they worked • participants' answers to open-ended questions
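
For concreteness, here is a minimal sketch (in Python) of how one participant/task record might hold both kinds of data. Every field name is illustrative, not part of the assignment; adapt them to your own test protocol.

```python
# Hypothetical record for one participant completing one task.
# Field names are illustrative; requires Python 3.9+ for list[str].
from dataclasses import dataclass, field

@dataclass
class TaskResult:
    participant: str          # e.g. "P1"
    task: str                 # e.g. "Find the contact page"
    # Quantitative (numeric, measurable) data
    success: bool             # did the participant complete the task?
    time_seconds: float       # time to complete the task
    pages_visited: int        # number of pages visited
    errors: int               # number of errors made
    satisfaction: int         # satisfaction-questionnaire rating, e.g. 1-5
    # Qualitative (non-numeric) data
    observer_notes: list[str] = field(default_factory=list)
    quotes: list[str] = field(default_factory=list)  # what the participant said
```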

  7. What to do w/quantitative data? • Try using a spreadsheet – it lets you • quickly see patterns in the data • use formulas to calculate information such as: • percentage of participants who succeeded or failed at each task • average time to complete each task • average number of pages visited in each task • frequency of specific problems
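
A minimal pandas sketch of those spreadsheet formulas, assuming a hypothetical task_results.csv with one row per participant per task and columns named participant, task, success (0/1), time_seconds, and pages_visited:

```python
# Spreadsheet-style summaries with pandas. All file and column names
# are assumptions for illustration.
import pandas as pd

results = pd.read_csv("task_results.csv")

summary = results.groupby("task").agg(
    pct_succeeded=("success", lambda s: 100 * s.mean()),
    avg_time_seconds=("time_seconds", "mean"),
    avg_pages_visited=("pages_visited", "mean"),
)
print(summary)

# Frequency of specific problems, assuming a separate observations.csv
# with one row per observed problem and a "problem" column.
observations = pd.read_csv("observations.csv")
print(observations["problem"].value_counts())
```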

  8. Did different participants have very different results? • Differences might correlate with some of the demographic factors in your pre/post-test questions • One way to turn the mishmash into data: • Add columns to your spreadsheet for each participant's answers to the demographic questions • Sort by demographics and see whether the other data correlate with specific demographic features
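
A sketch of that sort-by-demographics step, assuming a hypothetical demographics.csv with one row per participant; the column names (age_group, web_experience) are illustrative:

```python
# Do results differ by demographic group? Merge the task results with
# the pre/post-test answers, then compare groups. Names are assumptions.
import pandas as pd

results = pd.read_csv("task_results.csv")
demographics = pd.read_csv("demographics.csv")  # one row per participant

merged = results.merge(demographics, on="participant")

# Compare success rates and completion times across experience levels.
by_experience = merged.groupby("web_experience").agg(
    success_rate=("success", "mean"),
    avg_time_seconds=("time_seconds", "mean"),
)
print(by_experience)
```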

  9. What to do w/qualitative data? • Read through all the notes carefully, looking for patterns • Create a second spreadsheet for the qualitative data • Each row describes one problem • Include one column for: • each participant (put a "yes" or a 1 for each participant who had the problem) • notes/comments, so you can annotate as you go through the data (to remind yourself of what a participant did, or to capture an insight for your report) • the task/scenario the information in that row came from, so you can report by task/scenario if that would be useful to the development team
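
A sketch of the problem spreadsheet this slide describes: one row per problem, one column per participant (1 = had the problem), plus task and notes columns. The rows and three participants (P1–P3) are invented for illustration:

```python
# Build the problem matrix and count how many participants hit each
# problem - useful for spotting patterns. All data here is hypothetical.
import pandas as pd

problems = pd.DataFrame(
    [
        ("Find trials", "Clicked Research instead of Clinical Trials",
         1, 1, 0, "labels too similar"),
        ("Find trials", "Used site search after browsing failed",
         0, 1, 1, ""),
    ],
    columns=["task", "problem", "P1", "P2", "P3", "notes"],
)

problems["n_participants"] = problems[["P1", "P2", "P3"]].sum(axis=1)
print(problems.sort_values("n_participants", ascending=False))
```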

  10. Turn notes from observations into problem statements • When you analyze the data, be as specific as you can about what the participant did • A good problem statement says what the participant did and contrasts that with what the participant should have done • Good problem statement: Clicked on link to Research instead of Clinical Trials. • Poor problem statement: Clicked on wrong link. • Poor problem statement: Was confused about links. • Look for patterns in your problem statements so you can report each pattern, citing the specific statements as examples

  11. How do you know which results are most important? • As you review the results, consider: • how "global" the problem is throughout the site • focus on solving global problems first • consolidate recommendations for fixing isolated problems • how severe (or serious) the problem is • focus first on solving problems that prevent or seriously hamper users’ ability to use the system and complete tasks • consolidate recommendations for problems that are irritants but don’t keep users from completing specific tasks

  12. Local vs. global problems • The scenarios you used in the test only sample the site • You can’t test every possible pathway or every page • Consider whether what you’ve learned affects pathways or pages you did not see/use during the test • If so, give the problem higher priority when finding solutions and making recommendations • If not, save it for later

  13. For example… If you find in one scenario that participants can’t find what they need in the content because the page is so crammed with text that they won’t or can’t read it, you could say that specific page has problems and needs to be fixed. If it’s the only jam-packed page, it’s a local problem. But you should also consider how many other pages in the site are equally dense with text. If you had chosen a scenario that led to those other pages, how likely is it that you would have gotten the same results? Your finding may have implications for many other pages. Overly dense text may be a global problem within the site.

  14. Levels of severity • Some problems contribute more than others to participants being unable to complete the scenarios – fix these first • Rank the problems you’ve identified on a 3- or 4-point scale, for example: • Show stopper: If we don't fix this, users will not be able to complete the scenario. • Serious: Many users will be frustrated if we don't fix this; they may give up. • Minor: Users are annoyed, but this does not keep them from completing the scenario.
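
A sketch of how you might order a problem list for the Recommendations section using this severity scale, plus the global-vs-local distinction from slide 12. The severity labels follow this slide; the problem rows are invented:

```python
# Order problems by severity, then global scope, then reach.
# All data rows are hypothetical examples.
SEVERITY_RANK = {"show stopper": 0, "serious": 1, "minor": 2}

problems = [
    {"problem": "Dense text hides key content", "severity": "serious",
     "affected": 4, "global": True},
    {"problem": "Broken link on FAQ page", "severity": "show stopper",
     "affected": 5, "global": False},
    {"problem": "Low-contrast footer links", "severity": "minor",
     "affected": 2, "global": True},
]

# Most severe first; within a severity level, global problems before
# local ones, then problems affecting more participants.
problems.sort(key=lambda p: (SEVERITY_RANK[p["severity"]],
                             not p["global"],
                             -p["affected"]))
for p in problems:
    scope = "global" if p["global"] else "local"
    print(f'{p["severity"]:>12}  {scope:>6}  {p["problem"]}')
```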

  15. Progress Report: Overview • The Progress Report is due at the end of next class • Each team is responsible for submitting a group-written progress report that presents the results of the usability tests • It should • be written in memo format • not exceed four pages • include at least two visuals (table, graph, chart, or picture/screenshot)

  16. Progress Report: Format • Progress report focuses on work completed and work remaining • Required components (see assignment for full details!): • Overview – Summarize purpose & content of memo • Background – Recycle & update from Design Plan • Work Completed – Summarize findings w/at least 2 visuals • Work Remaining – Forecast findings, outline work left to do • Conclusion – Reiterate project status, assess chances of success, raise any concerns

  17. Rest of today & Tuesday • Analyze your data • Write progress report • Progress report due at end of class on Tuesday
