
Lecture #8 SPECIAL METHODS OF TESTING



  1. Lecture #8: SPECIAL METHODS OF TESTING Y39TUR Spring 2011 Tvorba uživatelského rozhraní (User Interface Design)

  2. Remote Testing

  3. Software for X • Example: An application needs to be tested for the market in country X. • Possibilities: • Invite 10 people from X to the Czech Republic • Air tickets, accommodation, visas • Not their own environment • Go to X • Use a local recruitment agency • Rent a usability lab • Vaccination • Is it always necessary? • Use remote testing

  4. Traditional Methods vs. Remote Testing • Traditional methods • Participants sit in the lab • Testers physically observe & record • Remote testing • Participants sit in their office/home • Testers observe their screen over a network connection & record

  5. Hierarchy of remote testing methods (time/place matrix) • Same place, same time: classic usability testing (up until now) • Same place, different time: (not included in the course) • Different place, same time: remote testing, teleconferencing • Different place, different time: surveys, on-line evaluation tools, …

  6. Same time, different place

  7. Same time, different place • Synchronous • People connected via teleconferencing • [Diagram: moderator and stakeholders connected to the participant via teleconferencing]

  8. Remote Testing • The testers observe the participants remotely • Via telephone • Via videoconferencing • Via screen capturing and streaming software • Could be a combination of a remote desktop (VNC, …) + a screen grabber (Camtasia, …) • Methodology similar to that of classic usability tests • Certain differences

  9. Quality Comparison • Selvaraj & Houck-Whitaker: • Remote tests are at least as effective as traditional ones • Benefits • Time and cost savings • You and your participants don’t need to spend time traveling • Realistic context of use • You reach people in their own environment • Geographic representation • Different portions of the globe can be covered • Access to professionals • It’s easier to ask a $500/hr professional to take part in the test because it takes less of their time

  10. Quality Comparison • Limitations • Lack of nonverbal signs • Communication delay • Low resolution of the video, or perhaps no video link at all • No control over the participant’s conditions • No way to check that the software is installed correctly • No way to make sure the participant is not being disturbed • The moderator can’t assist the users on-site • The users are on their own when using the system • A higher level of user IT literacy is expected • Cannot test with novice users

  11. Quality Comparison • Limitations • Will the users trust our application? • People afraid of spyware • Privately owned vs. corporate computers • Will the stakeholders believe that it’s not fake?

  12. Costs Comparison

  13. Remote Testing Overview • Very similar to the “classic usability testing” • Define Objectives & Target Audience • Set up Test Scenario • Recruit Test Users • Carry out Tests • Analyze Findings • Design Report & Brief Stakeholders

  14. Remote Testing Overview • What are you testing? • Who are you testing? • Representative Tasks • Within time-limits & user capabilities • In line with test objectives • Methods of data collection • Screen capture • Questionnaires, interviews

  15. Remote Testing: Test preparation • Consult the objectives with the project stakeholders • Develop instructions for the participants • Run a pilot test with home users • Apply the changes suggested by the results of the pilot test

  16. Remote Testing: Recruitment • Define the user profile & recruitment criteria • Set up a recruitment screener • The screener can be filled out on the web • Questionnaires → database of potential participants • Selection from the database • Telephone screener • Very low success rate (resembles telephone marketing) • Decide on incentives

  17. Remote Testing: Recruitment • Recruitment channels • Web • Social networks, mailing lists, job portals • Traditional: Newspapers, ads • With a URL to enter • Recruitment agency • May be important when testing in an unknown market • Perhaps better targeted participants • Web – advanced services • ethnio.com • clicktale.com

  18. ethnio.com – Example of Screener

  19. ethnio.com • Recruiting people directly from a website • Procedure: • Set up a screener in your ethnio.com profile • Set up your website to display the screener • A website visitor will see the screener • If the visitor responds, you are notified immediately • You contact the person by telephone / e-mail

  20. Remote Testing: Recruitment • Specific requirements • The users must be able to install: • The software that is to be tested • The tools used for the test • The task sheet for the participants must be more specific • There is no moderator at their location • Consent solicitation • By voice, saying “Yes, I agree.” • By clicking “I agree” on the screener form

  21. Remote Testing: Technology to use • Teleconferencing • Skype • Screen capture and streaming • VNC • Remote Desktop in MS Windows

  22. Carry out the Test • During the test: • confirm user profile eligibility • ask for permission to record session • limit moderator intrusion • encourage thinking aloud • take notes • deliver incentive/payment • have fun

  23. Analysis & Reports • During tests: track all usability issues • After each test: compare notes & analyze • After all tests: summarize patterns & major problems • Set up report & sample videos • Communicate to all stakeholders

  24. Same place, different time

  25. Same place, different time • Data are recorded physically on site • Data are picked up later on • Examples: • Customer satisfaction surveys • Elections • Geocaching

  26. Different time, different place

  27. Different time, different place • Asynchronous • Messages are passed between the testers and the participants • The whole test can take a considerable amount of time due to communication delays between the testers and the participants • Testers provide instructions • Through a website / e-mail message • Participants provide data • By answering a questionnaire • Through monitored interaction with the product • The data are aggregated automatically

  28. Different time, different place • Features • Can be done automatically • Good for quantitative data collection • Good when there are a lot of participants (25–100) • Drawback • We can’t control the conditions well

  29. Questionnaire-based Testing • Questionnaire • A set of questions • With defined responses ([yes][no], [1][2][3][4][5], …) • Open-ended questions • The same questionnaire administered to all participants • Easy to administer • Point to a web form • Send a structured e-mail • Easy to process • Automatic processing of the web forms • Automatic processing of returned e-mails
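To make the “easy to process” point concrete, here is a minimal sketch (not tied to any particular form service) of how fixed-response answers exported from a web form as CSV could be tallied automatically; the file name, the column names, and the "comments" field are illustrative assumptions.

```python
# Minimal sketch: aggregating fixed-response questionnaire answers
# exported from a web form as CSV. Column names ("q1"..."qN", "comments")
# and the file name are assumptions, not from any particular tool.
import csv
from collections import Counter

def aggregate(path):
    counts = {}          # question -> Counter of responses
    open_answers = []    # free-text answers kept separately
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for question, answer in row.items():
                if question == "comments":
                    if answer.strip():
                        open_answers.append(answer.strip())
                else:
                    counts.setdefault(question, Counter())[answer] += 1
    return counts, open_answers

if __name__ == "__main__":
    counts, comments = aggregate("responses.csv")
    for question, dist in counts.items():
        print(question, dict(dist))
    print(f"{len(comments)} open-ended comments collected")
```

The same tallying logic would apply to structured e-mail replies once they are parsed into rows.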

  30. Questionnaire-based Testing • Not many people respond to questionnaires • Need to “market” the study well • How to reach a specific target group? • The questionnaire should contain some screening questions • The questionnaire contains the screener • Danger of … self-selection!

  31. SUMI • Software Usability Measurement Inventory • Measuring software quality from the user’s point of view • “Quality of Use” • Input: • The software or its prototype must exist • 10 users minimum • Output: • Five “grades”: Efficiency, Affect, Helpfulness, Control, Learnability • Based on existing database of gathered questionnaires • Kept by the authors of SUMI

  32. SUMI: Use • How it can be used: • Assess new products during product evaluation • Make comparisons between products or versions of products • Set targets for future application developments • Test verifiable goals for quality of use • Track achievement of targets during product development • In a quantitative manner • Source: http://www.ucc.ie/hfrg/questionnaires/sumi/whatis.html

  33. SUMI: Scales • Efficiency • Tasks are completed by the user in a direct and timely manner • Affect • How much the product captures the user’s emotional responses • Helpfulness • The product seems to assist the user • Control • Users feel that they set the pace, not the product (they are in control) • Learnability • Ease with which the user can learn to use the software and/or new features

  34. SUMI: Questionnaire • 50 fixed and predefined questions, such as: • “This software responds too slowly to inputs” • “I would recommend this software to my colleagues” • “The instructions and prompts are helpful” • “I sometimes wonder if I am using the right command” • “I think this software is consistent” • Responses to these questions: • “Yes” • “No” • “Undecided”

  35. SUMI: Processing • The assignment between questions and scales is not disclosed. • SUMI is a commercial service • Know-how of the authors • Procedure: • Participants try the tested system • SUMI questionnaires are administered to the participants by the testers • Responses to the questionnaires are sent to the authors of SUMI • e-mail, web form, … • Testers receive the grades from SUMI • A nominal fee (hundreds of USD)

  36. SUMI: Example Evaluation • [Chart: scale scores of a tested system plotted against the reference value]

  37. SUMI: Evaluation • The results are reported with respect to the corpus of previously gathered data • The values show the usability of the system compared to the reference score (50 on each scale) • The data can be used to compare two different systems • Better score vs. worse score
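As an illustration of how such results are read, the sketch below compares two sets of hypothetical scale scores against the reference value of 50 and against each other. The numbers are invented: the real scores come back from the SUMI service, whose question-to-scale mapping is not disclosed.

```python
# Illustrative only: SUMI scale scores are computed by the SUMI service;
# the figures below are invented to show the comparison against the
# reference value (50 on each scale) and between two systems.
REFERENCE = 50

system_a = {"Efficiency": 55, "Affect": 48, "Helpfulness": 61,
            "Control": 52, "Learnability": 44}
system_b = {"Efficiency": 47, "Affect": 53, "Helpfulness": 49,
            "Control": 50, "Learnability": 58}

for scale in system_a:
    a, b = system_a[scale], system_b[scale]
    vs_ref = "above" if a > REFERENCE else "at or below"
    better = "A" if a > b else ("B" if b > a else "tie")
    print(f"{scale:12s}  A={a:3d} ({vs_ref} reference)  B={b:3d}  better: {better}")
```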

  38. SUMI: Evaluation • Enough to provide unbiased and objective results? • YES • Enough to give insights into particular problems? • NO … we only have 5 numbers as an output • We know nothing about the sources of errors

  39. Automated User Testing

  40. Automating Usability Testing • Usability testing: a prototype or the final application is provided to a set of users, and the evaluator collects and analyzes usage data; can be based on a set of predetermined tasks • What can be automated in such a method? • Capture of usage data • Analysis based on predefined metrics or a model • Usability evaluation of: navigation, functionalities • Federico M. Facca

  41. Capturing Data • Information can be: • easy to record but difficult to interpret (e.g., keystrokes) • meaningful but difficult to label correctly (e.g., when can a task be considered completed?) • Method type: • Performance logging (e.g., events and time of occurrence, no evaluator) • Remote testing (e.g., an assigned task performed by the user and monitored by evaluators) • Federico M. Facca

  42. Capturing Data on the Web – Server-side Logging • Web servers commonly log each user request to the server • Available information: IP address, request time, requested page, referrer • We can derive: • Number of visitors • Breakdown by countries • Coverage by robots • … • Federico M. Facca
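As a sketch of what “we can derive” means in practice, the fragment below counts unique visitor IPs and the most requested pages from an access log in the common/combined format; the file name and the exact log layout are assumptions to be adapted to the actual server configuration (a country breakdown would additionally need an IP geolocation database).

```python
# Minimal sketch: deriving simple visitor statistics from a web server
# access log in the common/combined format. The file name and the exact
# log format are assumptions; adapt the regex to your server's configuration.
import re
from collections import Counter

LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

visitors = set()
pages = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE.match(line)
        if not m:
            continue
        visitors.add(m.group("ip"))
        if m.group("status").startswith("2"):
            pages[m.group("path")] += 1

print(f"Unique visitor IPs: {len(visitors)}")
for path, hits in pages.most_common(10):
    print(f"{hits:6d}  {path}")
```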

  43. Server-side Logging • Pros • Huge quantity of easily available data • Does not require “ideal” users • Typical questions that we can answer: • “Which content is interesting?” • “Do people reach all the content?” • “Is all the content necessary?” … which is not the same as: • “Is the navigation good?” • “Does the new design keep people longer on the site?” • “Does the new design make people buy more?”

  44. Server-side Logging • Disadvantages: • Highly quantitative method • Almost no data on the user’s exact interaction with the interface

  45. Client-side Logging • Dedicated tools and settings • The web client must be enhanced to log information on the interaction • The client pushes the information into a repository on the testers’ server • Available information: IP address, request time, requested page, referring page, mouse position on the screen, clicked links, back button, … • Pros • Actual data on the user’s exact interaction with the interface • Sessions are automatically reconstructed • Against: • The participant must use this enhanced browser • Federico M. Facca
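A minimal sketch of the “repository on the testers’ server” side: an HTTP endpoint that stores interaction events pushed by an instrumented client as JSON lines. The payload format, port, and file name are assumptions; a real study would also need consent handling, authentication, and anonymization.

```python
# Sketch of a collection endpoint for client-side logging: accepts
# JSON-encoded interaction events (e.g., mouse position, clicked links)
# pushed by an instrumented web client and appends them to a file.
# Payload format, port, and file name are assumptions for illustration.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG_FILE = "interaction_events.jsonl"

class EventCollector(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            event = json.loads(self.rfile.read(length))
        except (ValueError, UnicodeDecodeError):
            self.send_response(400)
            self.end_headers()
            return
        # One event per line (JSON Lines) so sessions can be reconstructed later.
        with open(LOG_FILE, "a", encoding="utf-8") as out:
            out.write(json.dumps(event) + "\n")
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), EventCollector).serve_forever()
```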

  46. Tools: “Formal” Client-side User Tracking/Analysis • Commercial tools • ETHNIO (http://www.ethnio.com/) • Ulog/Observer (http://www.noldus.com) • UserZoom (http://www.userzoom.com) • ClickTale (http://www.clicktale.com/) • Usabilla (http://www.usabilla.com) • Nielsen Eye Tracking (example in the next slides) • Other tools (some are a bit old) • WebQuilt (http://guir.berkeley.edu/projects/webquilt/) • SCONE/TEA (http://www.scone.de/docus.html) • NIST WebMetrics (http://zing.ncsl.nist.gov/WebTools/, not only for tracking and the related analysis) • Federico M. Facca

  47. Tools: “Informal” Client-side Tracking/Analysis • Commercial tools • Google Analytics (http://www.google.com/analytics/) • Fireclick (http://www.fireclick.com/) • SiteCatalyst (http://www.omniture.com/products/web_analytics) • Hitslink (http://os.hitslink.com/) • Crazy Egg (http://crazyegg.com) – nice example • Usabilla (http://usabilla.com) • … tons, really • Free tools • Search with Google, you can find some • Server-side analysis • Again, tons of solutions! • Federico M. Facca

  48. usabilla.com • A Web 2.0 application • Main principle • Testers present a website screenshot • Participants mark points on the screenshots according to tasks, e.g.: • “Click on the element that you would remove from the page.” • Comments can be added • The testers can see the results in an aggregate form • Individual points and comments are anonymous

  49. [Screenshot: responses from TUR 2010 participants to the task “Click on the element that you found most interesting.”]

  50. usabilla.com: Example • Object of the test • Czech version of the home page of the DCGI website • Participants • CTU students • Tasks • Click on the element that you found most interesting • Click on the elements that you like most • Click on the elements that you would remove from the page • Where would you look for contact information? • Where would you look for CVs of the faculty members?
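To illustrate the aggregate view that tools like usabilla.com present, here is a small sketch that bins marked click points into a coarse grid so that hot spots on a screenshot stand out; the input format and the coordinates are hypothetical, not data from the DCGI test above.

```python
# Sketch of aggregating marked points in the spirit of usabilla.com:
# click coordinates collected per task (format assumed here) are binned
# into a coarse grid so hot spots on the screenshot become visible.
from collections import Counter

CELL = 100  # grid cell size in pixels

# Hypothetical responses: (participant_id, x, y) for one task.
clicks = [(1, 512, 140), (2, 530, 151), (3, 60, 620), (4, 525, 138)]

heatmap = Counter(((x // CELL) * CELL, (y // CELL) * CELL) for _, x, y in clicks)

for (gx, gy), count in heatmap.most_common():
    print(f"Region ({gx},{gy})-({gx + CELL},{gy + CELL}): {count} click(s)")
```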
