
Web Content Development


Presentation Transcript


  1. Web Content Development Dr. Komlodi Class 25: Evaluative testing

  2. Web Design and Evaluation • User, content, context research (Site scope) • Information organization (Site map) • User-system interaction design (Application flow) • Content creation (Content inventory) • Graphics design and branding • Labeling and navigation design (Wireframes) • Evaluate

  3. The aims • Introduction to the goals and methods of user interface evaluation • Practice methods • Focus on: • Usability evaluation • Expert reviews: Heuristic evaluation

  4. The need for evaluation • Usable and useful user interfaces and information architectures need evaluation • Evaluation should not be carried out by designers • Two main types of evaluation: • Formative evaluation is done at different stages of development to check that the product meets users’ needs. • Summative evaluation assesses the quality of a finished product. Our focus is on formative evaluation.

  5. What to evaluate Iterative design & evaluation is a continuous process that examines: • Early ideas for conceptual model • Early prototypes of the new system • Later, more complete prototypes Designers need to check that they understand users’ requirements.

  6. Bruce Tognazzini tells you why you need to evaluate “Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.” See AskTog.com for topical discussion about design and evaluation.

  7. When to evaluate • Throughout design • From the first descriptions, sketches, etc. of users’ needs through to the final product • Design proceeds through iterative cycles of ‘design-test-redesign’ • Evaluation is a key ingredient for a successful design.

  8. Design Example Video • Allison Druin et al.: Designing with and for children • http://www.umiacs.umd.edu/~allisond/ • Videos: • Juan Pablo Hourcade, Allison Druin, Lisa Sherman, Benjamin B. Bederson, Glenda Revelle, Dana Campbell, Stacey Ochs & Beth Weinstein (2002) SearchKids: a Digital Library Interface for Young Children. ACM SIGCHI 2002 Conference • Questions: • Who: who are the designers, evaluators, and other participants? • What & how: what evaluation methods are they applying and how are they using these?

  9. Four evaluation paradigms • ‘quick and dirty’ • usability testing • field studies • expert reviews

  10. Quick and dirty • ‘Quick & dirty’ evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked. • Quick & dirty evaluations can be done at any time. • The emphasis is on fast input to the design process rather than carefully documented findings.

  11. Usability testing • Usability testing involves recording typical users’ performance on typical tasks in controlled settings. Field observations may also be used. • As the users perform these tasks they are watched & recorded on video & their key presses are logged. • This data is used to calculate performance times, identify errors & help explain why the users did what they did. • User satisfaction questionnaires & interviews are used to elicit users’ opinions.
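
The slides do not show what the logged data looks like, so here is a minimal Python sketch of how recorded sessions might be turned into the performance measures mentioned above (completion, times, errors). The TaskLog record, its field names, and the example numbers are assumptions made purely for illustration, not part of the course material.

```python
from dataclasses import dataclass

@dataclass
class TaskLog:
    """One participant's log for one task (illustrative record format)."""
    participant: str
    task: str
    start_s: float   # task start time in seconds
    end_s: float     # task end time in seconds
    errors: int      # errors noted by the observer or extracted from the log
    completed: bool  # whether the participant finished the task

def performance_summary(logs, task):
    """Summarize completion rate, mean time, and mean errors for one task."""
    rows = [log for log in logs if log.task == task]
    times = [log.end_s - log.start_s for log in rows if log.completed]
    return {
        "participants": len(rows),
        "completion_rate": sum(log.completed for log in rows) / len(rows),
        "mean_time_s": sum(times) / len(times) if times else None,
        "mean_errors": sum(log.errors for log in rows) / len(rows),
    }

# Example session data (invented numbers, for illustration only).
logs = [
    TaskLog("P1", "find-journal", 0.0, 95.0, errors=2, completed=True),
    TaskLog("P2", "find-journal", 0.0, 140.0, errors=5, completed=False),
    TaskLog("P3", "find-journal", 0.0, 80.0, errors=1, completed=True),
]
print(performance_summary(logs, "find-journal"))
```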

  12. Evaluation • Observation methods • Define typical user tasks • Collect background information: • Demographic questionnaire • Skills questionnaire • Define success metrics • Collect performance and satisfaction data • Do not interfere with user • Think aloud • Prompt: What are you thinking? What are you doing? • But ask follow-up questions on problems • Analyze data • Suggest improvements
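
Satisfaction data collected at the end of a session is usually summarized per questionnaire item so weak areas stand out. The short sketch below illustrates that aggregation step only; the question names and ratings are hypothetical, since the slides do not prescribe a specific instrument.

```python
from statistics import mean

# Hypothetical post-test ratings (1 = strongly disagree, 5 = strongly agree).
responses = {
    "P1": {"easy_to_find": 4, "labels_clear": 3, "would_use_again": 5},
    "P2": {"easy_to_find": 2, "labels_clear": 2, "would_use_again": 3},
    "P3": {"easy_to_find": 5, "labels_clear": 4, "would_use_again": 4},
}

def satisfaction_by_question(responses):
    """Average each rating across participants to see where satisfaction is low."""
    questions = next(iter(responses.values())).keys()
    return {q: round(mean(r[q] for r in responses.values()), 2) for q in questions}

for question, score in satisfaction_by_question(responses).items():
    print(f"{question}: {score} / 5")
```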

  13. Usability Testing Exercise • Teams of three: • Participant • Test administrator • Note-taker • Test the following sites: • USMAI catalog (http://catalog.umd.edu/) • Research Port (http://researchport.umd.edu)

  14. Usability Testing Exercise Procedure • Whole group: Familiarize yourself with the site, try to figure out the goals and intended user group – the note-taker should take notes • The test administrator and note-taker should read and modify the usability evaluation script, including devising two tasks • Conduct the study • Post your notes and lessons learned about the site and the usability evaluation process

  15. Visit the Usability Lab

  16. Field studies • Field studies are done in natural settings • The aim is to understand what users do naturally and how technology impacts them. • In product design, field studies can be used to: identify opportunities for new technology; determine design requirements; decide how best to introduce new technology; evaluate technology in use.

  17. Expert reviews • Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems. • A key feature of predictive evaluation is that users need not be present • Relatively quick & inexpensive • Expert reviews entail one-half day to one week of effort, although a lengthy training period may sometimes be required to explain the task domain or operational procedures • There are a variety of expert review methods to choose from: • Heuristic evaluation • Guidelines review • Consistency inspection • Cognitive walkthrough • Formal usability inspection

  18. Expert reviews (cont.) • Expert reviews can be scheduled at several points in the development process, when experts are available and the design team is ready for feedback. • Different experts tend to find different problems in an interface, so 3-5 expert reviewers can be highly productive, as can complementary usability testing. • The danger with expert reviews is that the experts may not have an adequate understanding of the task domain or user communities. • Even experienced expert reviewers have great difficulty knowing how typical users, especially first-time users, will really behave.

  19. Heuristic Evaluation Example • Information visualization tool for intrusion detection • Project sponsored by Department of Defense • Review created by Enrique Stanziola and Azfar Karimullah

  20. Heuristics We developed a set of heuristics to evaluate the system effectively. We looked at the following criteria: • Match between user tasks and the transitions provided by the interface • Object grouping based on relatedness • Color usage – accessibility evaluation • Interface provides just enough information • Speak the user’s language • User’s conceptual model evaluation • User memory load (design issues) • Consistency evaluation • User feedback • Clearly marked exits • Shortcuts • Constructing error messages • Error handling • Help and documentation

  21. Findings The rest of this document focuses on the individual findings of each expert reviewer. We report each reviewer’s comments as they completed the tasks. Expert Reviewer A: • c.1. In the File menu, the user’s language is not used. There is no term like “New” or “New Session” to indicate the initial step the user must take to start a session. • c.2. No help is provided. • c.3. Labels on the color bar in the graph window are too small. Font size is not consistent with the font size used in the 3D graph display. • c.4. User language: the ‘Binding’ term used in the menu is hard to understand. Also, the window title ‘dGUI’ could be made more meaningful. • c.5. No keyboard navigation functions are available to the user in the data configuration window. • c.6. No clue as to how to select a variable (double-clicking) or how to deselect a selected variable. The dragging function is not evident to the user. Balloon help could be useful. Buttons next to the Visualization attribute list have no label.
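
The slides do not say how such comments are compiled into a report across reviewers. One possible approach, sketched below in Python, is to record each comment as a structured finding tied to a heuristic and a severity rating. The Finding fields and the 0-4 severity scale are assumptions (the scale follows common heuristic-evaluation practice, not the original review), and the example entries simply recast three of Reviewer A's comments.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Finding:
    """One heuristic-evaluation finding (fields are illustrative)."""
    reviewer: str
    heuristic: str   # which criterion from the heuristics list is violated
    location: str    # where in the interface the problem appears
    note: str        # the reviewer's comment
    severity: int    # 0 = not a problem ... 4 = usability catastrophe (assumed scale)

# A few of Expert Reviewer A's comments, recast as structured findings.
findings = [
    Finding("A", "Speak the user's language", "File menu",
            "No 'New' or 'New Session' item to indicate how to start a session.", 3),
    Finding("A", "Help and documentation", "Whole application",
            "No help is provided.", 3),
    Finding("A", "Consistency evaluation", "Graph window",
            "Color-bar labels use a smaller font than the 3D graph display.", 2),
]

# Group findings by heuristic so the report can list every violation of each criterion.
by_heuristic = defaultdict(list)
for finding in findings:
    by_heuristic[finding.heuristic].append(finding)

for heuristic, items in sorted(by_heuristic.items()):
    worst = max(item.severity for item in items)
    print(f"{heuristic}: {len(items)} finding(s), max severity {worst}")
```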

  22. Heuristic Evaluation Exercise • Louis Rosenfeld’s IA Heuristics (2004) • Select an area of heuristics: • Main page • Search interface • Search results • Site-wide & Contextual navigation • Evaluate the UMBC library site in light of these • Report your results to the class

  23. Choose the evaluation paradigm & techniques • Goals • Budgets • Participants • Time limits • Context

  24. Evaluating the 1984 Olympic Message System (OMS) • Early tests of printed scenarios & user guides • Early simulations of the telephone keypad • An Olympian joined the team to provide feedback • Interviews & demos with Olympians outside the US • Overseas interface tests with friends and family • Free coffee and donut tests • Usability tests with 100 participants • A ‘try to destroy it’ test • Pre-Olympic field test at an international event • Reliability testing of the system under heavy traffic
