
Understanding and Evaluating the User Experience with Information Spaces



Presentation Transcript


  1. Understanding and Evaluating the User Experience with Information Spaces Andrew Dillon HCI Lab Indiana University adillon@indiana.edu

  2. Why does user experience matter? “The improvements in performance gained through usable interface design are 3 or 4 times larger than those gained through designing better search algorithms” Sue Dumais, Microsoft - invited presentation to IU’s Computer Science Horizon Day, March 2000.

  3. Why do we need to test users? • Bailey (1993) asked 81 designers to assess 4 interfaces for users like themselves:

  Interface   Rating   Performance
  A           4        1
  B           3        2
  C           1        3
  D           2        4

  NB: 95% of designers selected an interface other than the one they performed best on.

  4. So what to test? The basics of interaction in context: User, Task, Tool

  5. Basic user tendencies: • Users don’t estimate their own performance well • Users change over time • Users are impatient • Users see things in their own way • Users seek to minimize cognitive effort

  6. Traditional approach: usability engineering • Usability defined: • Semantically • Featurally • Operationally

  7. So what is usability? • Semantic definitions • ‘user-friendliness’? • ‘ease-of-use’? • ‘ease-of-learning’? • ‘transparency’? • These definitions tend toward circularity and provide little value to design practice • However, the term captures something that people recognize as important

  8. Usability as a collection of features • Interface is usable if: • Links, search engine, nav bar, back button? • Graphical user interfaces (GUI) • Based on style guide recommendations? • Meets Nielsen’s or Shneiderman’s principles of design?

  9. Attribution Fallacy • The attribution fallacy suggests usability is a quality of an interface that is determined by the presence or absence of specific interface features. • This attribution leads to an over-reliance on guidelines and prescriptive rules for design

  10. Experience requires more than features • Users’ experience is contextually determined by their needs, their tasks, their history and their location. • Understanding this and knowing how to evaluate experience, is the primary purpose of this talk

  11. Operational definition Usability (of an application) refers to the effectiveness, efficiency, and satisfaction with which specified users can achieve specified goals in particular environments (ISO Ergonomic requirements, ISO 9241 Part 11: Guidance on usability specification and measures). Useful but overlooked, and still not the full story….

  12. Effectiveness The extent to which users can achieve their task goals. Effectiveness measures the degree of accuracy and/or completion, e.g., if the desired task goal is to locate information on a web site, then: Effectiveness = success of the user in locating the correct data

  13. Effectiveness can be a scale or an absolute value • If the outcome is ALL or NOTHING then effectiveness is an absolute value - the user either locates the info or does not... • If the outcome can be graded (the user can be partially right) then effectiveness should be measured on a scale - as a %, or a score from 1 (poor) to 5 (complete) • The scale should be determined by the evaluator in conjunction with developers and users
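The two cases above can be sketched as simple scoring functions; the function names and the 3-of-4 example are invented for illustration, not from the talk.

```python
def binary_effectiveness(found: bool) -> int:
    """All-or-nothing outcome: 1 if the user located the info, 0 if not."""
    return 1 if found else 0


def scaled_effectiveness(items_found: int, items_required: int) -> float:
    """Graded outcome: percentage of the required items the user located."""
    return 100.0 * items_found / items_required


print(binary_effectiveness(True))   # 1
print(scaled_effectiveness(3, 4))   # 75.0
```

Whether a task is all-or-nothing or graded, and what the grading scale looks like, is exactly the negotiation between evaluator, developers, and users that the slide describes.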

  14. Quality? • Some tasks do not have a definitive correct answer: • creative production (writing, design) • information retrieval • data analysis • management • Making a purchase….. • Effectiveness alone misses something...

  15. Efficiency • Measures the resources used to perform a task • e.g., time, effort, cost • In the case of Web site use, efficiency might equal the time taken to complete a task, or the navigation path followed, etc.

  16. Efficiency of using a redesigned web site • Time taken to complete task • Compared across tasks, across users or against a benchmark score • Number of steps taken • Number of deviations from ideal path Such variables are frequently highly positively correlated - but they needn’t be.
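As a sketch, the step-count and path-deviation measures above can be computed from a logged click path; the page names, the 3-step ideal path, and the timestamps below are all hypothetical.

```python
# Hypothetical ideal path (3 steps) vs. an actual user path (7 steps),
# echoing the 7:3 example on the next slide.
ideal_path = ["home", "products", "checkout"]
actual_path = ["home", "about", "home", "products",
               "search", "products", "checkout"]

steps_taken = len(actual_path)                                   # 7
excess_steps = steps_taken - len(ideal_path)                     # 4
off_path_visits = sum(1 for p in actual_path if p not in ideal_path)  # 2

# Time on task from the first and last logged event (seconds, invented).
timestamps = [0.0, 4.2, 7.1, 11.8, 15.0, 19.3, 24.6]
time_on_task = timestamps[-1] - timestamps[0]                    # 24.6
```

Note that the three numbers need not move together: a user can be slow but stay on the ideal path, or fast while wandering, which is the point of the "frequently correlated - but they needn't be" caveat.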

  17. Efficiency in path analysis Ideal path: 3 steps

  18. Efficiency in path analysis Actual to ideal user navigation - 7:3 steps

  19. But is it efficiency that users want? • The push to efficiency is symptomatic of an engineering-oriented approach • Who determines efficiency? • Are path deviations always inefficient? • Is time equally weighted by user, designer or owner? • Suggests a need for negotiation beyond typical usability tests

  20. Satisfaction • Measures the affective reaction (likes, dislikes, attitudinal response) of users to the application • Assumed to be influenced by, but not the same as, effectiveness or efficiency, e.g., • 2 applications with equal effectiveness and efficiency may not be equally satisfying to use • or: what users like might not be what they need!

  21. Basis for satisfaction? • Positively influenced by effectiveness and efficiency • Also • Personal experience with other technologies? • Working style? • Manner of introduction? • Personality of user? • Aesthetics of product?

  22. Satisfaction is important • Good usability studies recognize this But satisfaction is not enough…. • People often like what they don’t use well • What about empowerment, challenge etc?

  23. Beyond usability: P-O-A • User experience can be thought of at three levels: • Process • Outcome • Affect • Full evaluation needs to cover these bases

  24. Experiencing IT at 3 levels: • What user does • What user attains • How user feels

  25. Process: what the user does • Navigation paths taken • Use of back button or links • Use of menus, help, etc. • Focus of attention The emphasis is on tracking the user’s moves and attention through the information space

  26. Outcome: what the user attains • What constitutes the end of the interaction? • Purchase made? • Details submitted? • Information located? The emphasis is on observing what it means for a user to feel accomplishment or closure

  27. Affect: how the user feels • Beyond satisfaction, we need to know if user feels: • Empowered? • Annoyed, frustrated? • Enriched? • Unsure or wary? • Confident? • Willing to come back? The emphasis is on identifying what the interaction means for the user

  28. User experience = behavior + result + emotion
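One way to make sure an evaluation captures all three components is to record them together per test session; the record type, field names, and sample values below are illustrative, not a standard schema.

```python
from dataclasses import dataclass, field


@dataclass
class SessionRecord:
    """One user-test session, recorded at all three levels of experience."""
    behavior: list[str] = field(default_factory=list)  # what the user does, e.g. click path
    result: bool = False                               # what the user attains: task completed?
    emotion: str = ""                                  # how the user feels, from self-report


session = SessionRecord()
session.behavior += ["home", "search", "results"]
session.result = True
session.emotion = "confident"
```

A study that logs only `behavior`, or only a satisfaction score, leaves two of the three fields empty, which is the gap the P-O-A framing is pointing at.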

  29. Interesting ‘new’ measures of UE • Aesthetics • Perceived usability • Cognitive effort • Perception of information shapes • Acceptance level • Self-efficacy UE proposes a range of measures not normally associated with usability testing

  30. Aesthetics and user performance -Dillon and Black (2000) • Took 7 interface designs with known user performance data • Asked 15 similar users to rate “aesthetics” and “likely usability” of each alternative design • Compared ratings with performance data

  31. Rankings of 7 interfaces (r = .85, r = .83). Correlation between aesthetics and performance = 0

  32. Follow-up study: • 30 users • Rated the aesthetics and likely usability of 4 web search interfaces, then used them • Rated aesthetics and usability again • No correlation with performance!

  33. So what? • Users respond to interface beauty • Users do not predict their own performance (process and outcome) accurately • Designers cannot usefully predict user response through introspection, theory or asking their colleagues!

  34. Time matters... [Chart: error scores for regular users of software, across trial days] So design stops being important?

  35. NO…it remains important….

  36. So what? • User experience is dynamic - • Most evaluations miss this • User data is the best indicator of interaction quality….REPEAT THIS TO SELF DAILY!!!!! • To be valid and reliable, the user data must reflect all aspects of the user experience: • P-O-A • The targets are moving….user experience is growing daily in web environments

  37. Genres in information space • Users have expectations of information spaces • Documents have genres • E-business is “shopping” • A website is a website is a website…. • Expectations activate mental models which drive what users see and interpret

  38. What does a home page look like? Dillon and Gushrowski (2000) • We analyzed a sample of 100 home pages for features • Then tested 8 candidate pages manipulating the most common or uncommon features of existing pages • New users were asked to rate the pages they thought were ‘best’ • Significant positive correlation resulted

  39. [Chart: page features vs. user ranking] Correlation between features and user ranking: r = 0.95, d.f. = 6, p < .01
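The reported statistic is a Pearson correlation over the 8 candidate pages (hence d.f. = 8 - 2 = 6). A minimal sketch of the computation, using invented feature-count/user-ranking pairs rather than the study's actual data:

```python
from math import sqrt

# Toy data: one feature score and one user ranking per candidate page.
feature_scores = [1, 2, 3, 4, 5, 6, 7, 8]
user_rankings = [2, 1, 3, 5, 4, 6, 8, 7]

n = len(feature_scores)
mx = sum(feature_scores) / n
my = sum(user_rankings) / n
cov = sum((x - mx) * (y - my) for x, y in zip(feature_scores, user_rankings))
r = cov / sqrt(sum((x - mx) ** 2 for x in feature_scores)
               * sum((y - my) ** 2 for y in user_rankings))  # r ≈ 0.93 here
df = n - 2  # degrees of freedom for the significance test: 6
```

With the study's actual feature counts and rankings in place of the toy lists, this is the r = 0.95 computation the slide reports.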

  40. Implications • Expectations for digital information spaces are forming quickly • Violation of expectancy impacts initial user ratings • Full report online at: • http://memex.lib.indiana.edu/adillon/genre.html

  41. Maximizing your evaluations: • Measure the 3 aspects of UE • Process, Outcome and Affect • Design user tests that capture multiple sources of data • Protocols, screen capture, attitude, speed, free-form answers • Don’t rely on gurus or guidelines! • A little data goes a long way!

  42. Example web site protocol (User guesses) 1.32:“What do I choose here?....looks like there is no direct link....and I don’t like the colors here, too bright...er.... (SELECTS TEACHING).. 1.39: ‘teaching and courses’ sounds right (SCREEN CHANGES).. 1.41: “oh this is all about missions and stuff...hang on.... (HITS BACK BUTTON) 1.48: “well.....that looks the best of these, you know.” (Negative comments) (Navigation strategy)

  43. Biggest user complaints in our lab • Poor content • Slow loading • Poor aesthetics • Unclear menu options (menus with example sub-items are much preferred and lead to more efficient use) • Too much clicking and “forced” navigation • No site map • Poor search facilities
