
Component-specific usability testing



  1. Component-specific usability testing Dr Willem-Paul Brinkman, Lecturer, Department of Information Systems and Computing, Brunel University (willem.brinkman@brunel.ac.uk)

  2. Topics • Introduction • Whether and how the usability of components can be tested empirically • Testing different versions of a component • Testing different components • Whether and how the usability of components can be affected by other components • Consistency • Memory load

  3. Introduction • Component-Based Software Engineering • Empirical Usability Testing

  4. Layered communication

  5. Layered Protocol Theory (Taylor, 1988) [Diagram: a user operating a calculator through layered components (Editor and Processor). The intended sum "15 + 23 =" travels down the layers as keystrokes and binary codes (e.g. 01111, 10111, 100110), the result "38" travels back up, and each layer forms its own control loop with control equations and control results.]

  6. Usability Testing • Aim: to evaluate the usability of a component based on the message exchange between a user and that specific component

  7. Two paradigms • Multiple versions testing paradigm • Single version testing paradigm [Diagram: component life cycle with Create, Manage, Support, and Re-use stages]

  8. Test Procedure • Normal procedures of a usability test • A user task that requires interaction with the components under investigation • Users must complete the task successfully

  9. Component-specific measures [Diagram: the component's control process and control loop with the user] • Objective performance: the number of messages the component receives, i.e. the effort users put into the interaction • Perceived ease-of-use • Perceived satisfaction
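
A minimal sketch of the objective measure on this slide, assuming a hypothetical interaction log in which every message is tagged with the component that receives it (the log format and component names are illustrative, not from the slides):

```python
from collections import Counter

# Hypothetical log of (receiving component, message) pairs recorded
# during a usability test session.
log = [
    ("Keypad", "key 5"), ("Keypad", "key 5"), ("Keypad", "key 5"),
    ("FunctionSelector", "open messages menu"),
    ("SendTextMessage", "send"),
]

# Objective performance per component: the number of messages it
# received, i.e. the effort the user put into interacting with it.
messages_received = Counter(component for component, _ in log)
print(messages_received)
```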

  10. Component-specific measures: increasing the statistical power • Two measures of the same underlying usability x: y1 = x·βk + εk (keys, an overall measure) and y2 = x·βm + εm (messages, a component-specific measure), where εk = εk,component + εk,rest and εm = εm,component + εm,rest • Assumption: εk,rest > εm,rest, i.e. the component-specific measure carries less error variance unrelated to the component
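
The power argument on this slide can be illustrated with a small simulation (a sketch under assumed numbers: the sample size, effect size, and variances are invented for illustration). Both measures pick up the same usability difference between two versions, but the component-specific measure has a smaller rest error, so a test on it detects the difference more often:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, trials, effect = 20, 2000, 0.6  # assumed users per version, runs, true difference

def detection_rate(rest_sd):
    """Fraction of simulated experiments in which a t-test (alpha = .05)
    finds the effect, given the measure's unrelated 'rest' error spread."""
    hits = 0
    for _ in range(trials):
        a = rng.normal(0.0, 1.0, n) + rng.normal(0, rest_sd, n)     # version A
        b = rng.normal(effect, 1.0, n) + rng.normal(0, rest_sd, n)  # version B
        hits += stats.ttest_ind(a, b).pvalue < 0.05
    return hits / trials

# Overall measure (keys): large rest variance from all other components.
print("overall measure:           ", detection_rate(rest_sd=2.0))
# Component-specific measure (messages): small rest variance.
print("component-specific measure:", detection_rate(rest_sd=0.5))
```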

  11. Component-specific measures • Component-specific questionnaires increase the statistical power because they help users to remember their control experience with a particular interaction component

  12. Component-specific measures • Perceived Usefulness and Ease-of-use questionnaire (Davis, 1989), 6 questions rated from Unlikely to Likely, e.g. • Learning to operate [name] would be easy for me. • I would find it easy to get [name] to do what I want it to do.

  13. Component-specific measures • Post-Study System Usability Questionnaire (Lewis, 1995), rated from Strongly disagree to Strongly agree, e.g. • The interface of [name] was pleasant. • I like using the interface of [name].

  14. Experimental validation • 80 users • 8 mobile telephones • 3 components manipulated according to Cognitive Complexity Theory (Kieras & Polson, 1985): • Function Selector • Keypad • Short Text Messages

  15. Architecture [Diagram: layered architecture of the mobile telephone, with the Send Text Message component on top of the Function Selector, which sits on top of the Keypad]

  16. Experimental validation • Function Selector • Broad/shallow version • Narrow/deep version

  17. Experimental validation: Keypad • Repeated-Key Method (e.g. "L") • Modified-Model-Position Method (e.g. "J")

  18. Experimental validation: Send Text Message • Simple version • Complex version

  19. Results • Average probability that a measure finds a significant (α = 0.05) effect for the usability difference between the two versions of the Function Selector (FS), Short Text Messages (STM), or Keypad components

  20. Results • Wilcoxon Matched-Pairs Signed-Ranks Tests between the numbers of correct classifications made by discriminant analyses on overall and on component-specific measures
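
The comparison named here can be sketched in outline (the counts below are invented for illustration; the slide reports the actual results): paired counts of correct classifications per test case, by measure set, compared with a Wilcoxon matched-pairs signed-ranks test.

```python
from scipy import stats

# Invented example: correct classifications per test case, once from
# discriminant analysis on overall measures, once on component-specific ones.
overall  = [5, 6, 4, 7, 5, 6, 5, 4]
specific = [7, 8, 6, 8, 6, 8, 7, 6]

# Paired (matched-pairs) Wilcoxon signed-ranks test on the two counts.
print(stats.wilcoxon(overall, specific))
```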

  21. Topics • Introduction • Whether and how the usability of components can be tested empirically • Testing different versions of a component • Testing different components • Whether and how the usability of components can be affected by other components • Consistency • Memory load

  22. Two paradigms • Multiple versions testing paradigm • Single version testing paradigm [Diagram: component life cycle with Create, Manage, Support, and Re-use stages]

  23. Testing Different Components • Component-specific objective performance measure: messages received, each multiplied by a weight factor (a common currency) • Comparison with an ideal user (a common point of reference) • The usability of individual components in a single device can then be compared, and the components prioritised for potential improvements

  24. Assigning weight factors to represent the user's effort in the case of an ideal user (Right Mouse Button Menu and Properties dialog) • Task {7}: Set <Fill colour red, no border> • {2} Call <> • {1} Click <right> • {1} Click <left on Properties option> • {1} Click <left on Fill tab> • {1} Click <left on colour red> • {1} Click <left on Outline tab> • {1} Click <left on No Line button> • {1} Click <left on OK button>

  25. Total effort value (Right Mouse Button Menu and Properties) • Total effort = Σ MRi·W • MRi·W: message received × weight factor • From the trace above: the Properties dialog receives five weight-1 clicks plus the weight-2 Call <>, giving 5 + 2 = 7; the Right Mouse Button Menu receives two weight-1 clicks, giving 2
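
A minimal sketch of the total-effort calculation, using the weighted messages from the trace above (the data structure and the split of messages per component are an interpretation of the slide's diagram):

```python
# Weighted messages received by each component, written as
# (message, weight) pairs taken from the trace on slide 24.
received = {
    "Right Mouse Button Menu": [
        ("Click <right>", 1),
        ("Click <left on Properties option>", 1),
    ],
    "Properties": [
        ("Call <>", 2),  # invoking the dialog counts with weight 2
        ("Click <left on Fill tab>", 1),
        ("Click <left on colour red>", 1),
        ("Click <left on Outline tab>", 1),
        ("Click <left on No Line button>", 1),
        ("Click <left on OK button>", 1),
    ],
}

# Total effort per component = sum of MRi x W over its received messages.
for component, messages in received.items():
    print(component, sum(weight for _, weight in messages))
# Right Mouse Button Menu -> 2, Properties -> 5 + 2 = 7
```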

  26. Assigning weight factors in the case of a real user [Diagram: component stack of Visual Drawing Objects, Properties, and Right Mouse Button Menu] • Correction for the inefficiency of higher and lower components

  27. Assigning weight factors in the case of a real user • Inefficiency of lower-level components: more messages are needed to pass a message upwards than ideally required • Assign weight factors as if the lower components operated optimally

  28. Assigning weight factors in the case of a real user • Inefficiency of higher-level components: more messages are requested than ideally required • UE = Σ MRi·W × (#MSUideal / #MSUreal) • UE: user effort • MRi·W: message received × weight factor • #MSUreal: number of messages sent upward by the real user • #MSUideal: number of messages sent upward by an ideal user

  29. Ideal user versus real user • Calculate for each component: the total effort an ideal user would make, the total effort the real user made, and the extra effort the real user made • Extra User Effort = User Effort - Total effort • Prioritise components by their extra user effort
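
Putting slides 25 to 29 together, a minimal sketch of the prioritisation step (the function names and numbers are invented; the UE formula follows the reconstruction on slide 28):

```python
def user_effort(weights, msu_ideal, msu_real):
    """UE = (sum of MRi x W) x #MSU_ideal / #MSU_real: the real user's
    weighted effort, scaled down for messages a higher-level component
    requested beyond what an ideal user would have sent upward."""
    return sum(weights) * msu_ideal / msu_real

def extra_user_effort(weights, msu_ideal, msu_real, ideal_total_effort):
    """Extra User Effort = User Effort - Total effort of the ideal user."""
    return user_effort(weights, msu_ideal, msu_real) - ideal_total_effort

# Illustrative numbers only: a real user sent 12 weighted messages to a
# component whose ideal-user total effort is 7 (slide 25's example).
print(extra_user_effort([1] * 10 + [2], msu_ideal=9, msu_real=9,
                        ideal_total_effort=7))  # -> 5.0 extra effort
```

Components with the largest extra user effort are the first candidates for improvement, since the measure expresses every component's cost in the same currency of weighted messages.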

  30. Experimental validation • 40 users • 40 mobile telephones • 2 components manipulated (the Keypad was fixed to the Repeated-Key Method): • Function Selector • Short Text Messages

  31. Results [Chart: extra user effort per component across the mobile phones]

  32. Results • Partial correlations between the extra user effort regarding the two components and other usability measures (*p < .05, **p < .01)

  33. Comparison with other evaluation methods: overall measures (e.g. keystrokes, task duration, overall perceived usability) • Relatively easy to obtain • Unsuitable for evaluating components

  34. Comparison with other evaluation methods: sequential data analysis • Based only on lower-level events • Requires pre-processing: selection, abstraction, and re-coding • The relation between a higher-level component and a compound message is less direct • The components' status is not recorded

  35. Comparison with other evaluation methods: GOMS • Helps to understand the problem • Considers only error-free task execution • Considers the system only at the lowest-level layer

  36. Comparison with other evaluation methods: thinking-aloud, cognitive walkthrough, and heuristic evaluation • Quicker • Evaluator effect (reliability)

  37. Topics • Introduction • Whether and how the usability of components can be tested empirically • Testing different versions of a component • Testing different components • Whether and how the usability of components can be affected by other components • Consistency • Memory load

  38. Consistency problems

  39. Consistency • Activation of the wrong mental model

  40. Consistency experiments • 48 users • Used 3 sets of applications: • 4 Room Thermostats • 4 (2 Web-Enabled TV sets × 2 Web Page Layouts) • 4 Applications (2 Timers × 2 Application domains)

  41. Within one layer

  42. Within one layer – Experimental Design [2 × 2 design: the day-time temperature setting uses a Moving Pointer or a Moving Scale, crossed with the night-time temperature setting using a Moving Pointer or a Moving Scale]

  43. Within one layer - Results

  44. Between layers • Web-enabled TV set: browser versus web pages

  45. Between layers - Page Layout • Matrix layout • List layout

  46. Between layers - Browser

  47. Between layers – Experimental Design [2 × 2 design: Web Page Version (List or Matrix) × Browser (Linear or Plane)]

  48. Between layers - Results

  49. Application domain

  50. Between Application domains – Experimental Design [Table: Timer versions crossed with Application domain: Alarm radio (cf. Mechanical alarm) and Microwave (cf. Hot dish)]
