Steps in Planning a Usability Test


Presentation Transcript


  1. Steps in Planning a Usability Test • Determine Who We Want to Test • Determine What We Want to Test • Determine Our Test Metrics • Write or Choose Our Scenario • Plan a Test Protocol • Create Data Gathering Tools • Plan Analysis Methods

  2. Planning a User Test: User Groups Your user stories should allow you to identify goals each user group might have when visiting the site. E.g., “prospective members.” Goal: As a prospective member, I want to know what the benefits of joining are. Using what content?: Info about past events and upcoming events… (what else?)
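
For planning purposes, here is a minimal sketch (in Python, not part of the original deck) of one way to record each user group's goals and the content meant to support them; the structure itself is an assumption, and the entries simply restate the slide's example.

    # Hypothetical planning structure: user group -> goals -> supporting content.
    user_goals = {
        "prospective members": [
            {
                "goal": "Know what the benefits of joining are",
                "content": ["info about past events", "info about upcoming events"],
            },
        ],
    }

    for group, goals in user_goals.items():
        for item in goals:
            print(f"{group}: {item['goal']} <- {', '.join(item['content'])}")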

  3. Planning a User Test: Scenarios • Use the info you have gathered to write a sample scenario, including information about • Who • What • Why • Where (but not How!)

  4. Try your own scenario! Write one about a typical user for your site. Remember…don’t put “how” info into the scenario. Stick to: who, what, when, why, where…

  5. Determine Who to Test • Consider your design priorities • Think about a representative sample of the group you choose. User Group: Students Target Audience: Those looking for on-campus housing for the coming year User Story: “…I want to choose the place I want to live and my roommate…”

  6. Determine What to Test Based on the user goal, list specific, observable outcomes we can solicit from users in one or more tasks. Example Outcomes: • Starting at home page, locate… • After test, recall… • Submit all information necessary to…
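
A minimal sketch (Python, assumed) of keeping user goals lined up with their observable outcomes; the goal and outcome wording below is hypothetical, loosely based on the housing example from the previous slide.

    # Hypothetical mapping from a user goal to the observable outcomes
    # that one or more test tasks will try to elicit.
    test_plan = {
        "Choose on-campus housing and a roommate for next year": [
            "Starting at the home page, locate the housing application",
            "After the test, recall the application deadline",
            "Submit all information necessary to request a roommate",
        ],
    }

    for goal, outcomes in test_plan.items():
        print(goal)
        for outcome in outcomes:
            print(" -", outcome)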

  7. Determine Test Metrics: Performance Criteria Specific Criteria for Success for each Outcome • User locates X piece of information and writes it down on the test form • User is able to find and download X to the desktop (yes/no) • User is able to complete X task in less than 10 minutes total (anticipate sources of non-task-related delays) Note: Common performance metrics are based on task success, time, and # of errors
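
As a hedged sketch of how these performance criteria might be recorded and checked per task, the record fields and the 10-minute threshold below are illustrative assumptions, not a prescribed format.

    from dataclasses import dataclass

    # Hypothetical record of one participant's attempt at one task.
    @dataclass
    class TaskResult:
        task_id: str
        completed: bool       # did the user reach the defined outcome?
        time_seconds: float   # total task time, minus non-task-related delays
        error_count: int      # wrong links, backtracking, etc.

    def meets_performance_criteria(result: TaskResult,
                                   max_seconds: float = 600) -> bool:
        """Success here = task completed within the 10-minute limit."""
        return result.completed and result.time_seconds <= max_seconds

    # Example: a participant finishes the task in 7.5 minutes with 2 errors.
    print(meets_performance_criteria(TaskResult("task-1", True, 450.0, 2)))  # True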

  8. Determine Test Metrics: User Satisfaction Criteria Specific Criteria for Success for each Outcome • User finds the site helpful, well-suited to the task (4 or 5 on a 5 pt. scale) • User finds the site easy to use (4 or 5 on a 5 pt. scale) • Users are confident that they completed the task successfully (4 or 5, etc.) Note: Common satisfaction metrics are based on confidence of task success, perceived difficulty, and frustration level
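
Continuing the same sketch, the “4 or 5 on a 5 pt. scale” criterion can be checked mechanically; the question keys and scores below are invented for illustration.

    # Hypothetical post-task questionnaire responses on a 5-point scale.
    responses = {
        "helpful_well_suited": 5,   # "The site was helpful / well-suited to the task"
        "easy_to_use": 4,           # "The site was easy to use"
        "confident_of_success": 3,  # "I'm confident I completed the task successfully"
    }

    THRESHOLD = 4  # "4 or 5 on a 5 pt. scale" counts as meeting the criterion

    satisfaction_met = {q: score >= THRESHOLD for q, score in responses.items()}
    print(satisfaction_met)
    # {'helpful_well_suited': True, 'easy_to_use': True, 'confident_of_success': False}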

  9. Write a Scenario + Tasks • Create a background scenario to orient the participant to their role and goals…can be drawn from your “use case” • Create individual task descriptions that match up with each observable outcome • Sequence tasks so as to avoid interference issues (e.g. learning effects)

  10. Sample task 1. You are considering attending a sailing club event. Use the site to determine the most appropriate upcoming event for learning about the club. For this, we’d want to identify ahead of time the “answers” and decide on some metrics for determining success.
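
A rough sketch of what identifying the “answers” and success metrics ahead of time could look like as a task specification; the expected answer and time limit here are hypothetical, not from the slides.

    # Hypothetical task specification for the sample task above.
    task_spec = {
        "task_id": "task-1",
        "prompt": ("You are considering attending a sailing club event. "
                   "Use the site to determine the most appropriate upcoming "
                   "event for learning about the club."),
        "expected_answer": "New Member Open House",  # assumed, not from the slides
        "max_seconds": 600,
    }

    def grade(answer: str, seconds: float, spec: dict) -> bool:
        """Success = correct answer found within the time limit."""
        return (answer.strip().lower() == spec["expected_answer"].lower()
                and seconds <= spec["max_seconds"])

    print(grade("new member open house", 300, task_spec))  # True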

  11. Create a Protocol, 1: Tasks & Preconditions Task 1: Locate X pieces of information. Preconditions: Sailing club main page showing; all visited links cleared. User may use bookmarks to “collect” pages, etc.; therefore, the bookmarks file must be cleared for each new user.

  12. Create a Protocol, 2: Order of Events • Disclaimer & Confidentiality • Pre-Test Questionnaire • User reads background info out loud; questions? • User reads Scenario 1, does Task 1, and completes the post-task questionnaire; continue until all 3 tasks are complete • Post-test Interview • Thanks!

  13. Create Test Materials • Disclaimer; thank you note. • Background information sheet on user’s “role” • Scenarios & task sheets with blanks for participant fill-ins • Questionnaire/Interview Questions: pre-test, post-task, post-test • Observation notes sheets

  14. Test Day! Before the test • Remind all of your users of the agreed upon time, place, etc. • Double check the room and equipment • Be sure all test materials are present • Be sure the computer is in the correct beginning state (turn off the screen saver) • Do any pre-test data gathering that is needed

  15. Introduce the test situation • explain the purpose of the test • remind them they aren’t the ones being tested • explain how results will be used • remind them of any confidentiality issues • remind them that their participation is voluntary and that they can stop at any time • explain any video or audio taping that will be done • invite them to ask questions, etc.

  16. Introduce the test activities • give any necessary instructions about the specific tasks users will be doing • remind them to talk aloud if you are using that protocol • give them a way to tell when they are finished with the test • offer them a chance to ask clarification questions • give them clear cues for beginning and ending timed sequences

  17. During the test • depending on the type of data being gathered, you may want to avoid helping during the test • it’s fine to give help that doesn’t affect the data • if a user is getting angry or frustrated, you should help them • it’s OK to acknowledge user comments to keep the test going (“mmm-hmm,” etc.) • appoint a “helper” if possible to escort the user throughout the test

  18. After the test • Do any debriefing, exit interviews or questionnaires • Give users a chance for “open” feedback about the product or the test itself • Thank the users, and pay them if applicable • Label and file all the data from the test as soon as possible afterwards • If you are in the user’s environment, leave it nice and clean!

  19. Analyze Data: formative focus • Aim for a “Top 5 Changes” format • Group your data by user goal; for each, summarize performance & satisfaction data • Give a summary assessment of your design’s performance for each user goal • List significant “stoppers” and major complaints • Rank each goal that has negative data associated with it, and each major complaint, in order of severity
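
As one possible way to operationalize this analysis (a sketch, not the deck’s prescribed method), findings can be grouped by user goal and ranked by severity to produce the “Top 5 Changes” list; the findings, severity scale, and field names below are invented.

    from collections import defaultdict

    # Hypothetical findings: (user goal, issue, severity on a 1-4 scale, 4 = "stopper").
    findings = [
        ("Find an upcoming event", "Event calendar link hidden in the footer", 4),
        ("Find an upcoming event", "Event dates shown without the year", 2),
        ("Learn membership benefits", "Benefits page buried three clicks deep", 3),
        ("Learn membership benefits", "Jargon-heavy benefit descriptions", 2),
    ]

    # Group by user goal for the per-goal performance/satisfaction summary...
    by_goal = defaultdict(list)
    for goal, issue, severity in findings:
        by_goal[goal].append((severity, issue))
    for goal, items in by_goal.items():
        worst = max(s for s, _ in items)
        print(f"{goal}: {len(items)} issues, worst severity {worst}")

    # ...then rank everything by severity and keep the top 5 as the change list.
    top_changes = sorted(findings, key=lambda f: f[2], reverse=True)[:5]
    for goal, issue, severity in top_changes:
        print(f"[severity {severity}] {goal}: {issue}")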
