
Usability & Evaluation in Visualizing Biological Data

This article explores the importance of usability and evaluation in visualizing biological data, discussing common myths and emphasizing the need for user-centric, iterative engineering processes. It also covers requirements analysis, measurement techniques, and the role of insight in visualization. Examples of visualization tools such as Spotfire and GeneSpring are mentioned.


Presentation Transcript


  1. Usability & Evaluation in Visualizing Biological Data. Chris North, Virginia Tech. VizBi

  2. Usomics & Evaluation in Visualizing Biological Data. Chris North, Virginia Tech. VizBi

  3. Myths about Usability • Usability = Voodoo

  4. Science of Usability (diagram): a Phenomenon studied by Science through Measurement and Modeling, and acted on through Engineering …analogy to biology

  5. Usability Engineering cycle: 1. Analyze Requirements -> 2. Design -> 3. Develop -> 4. Evaluate • User-centric • Iterative • Engineering = process to ensure usability goals are met
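To make the cycle on this slide concrete, here is a minimal Python sketch of the iterative process. Every function in it (analyze_requirements, design, develop, evaluate) is a hypothetical stand-in for illustration, not an API from any tool mentioned in the talk.

```python
# Hypothetical sketch of the iterative usability engineering cycle.
# All functions are illustrative stubs.

def analyze_requirements():
    # 1. Understand the users and their tasks (e.g., observation, interviews).
    return {"task": "compare gene expression across timepoints"}

def design(requirements):
    # 2. Sketch an interface that addresses the requirements.
    return {"prototype": "small-multiples view", "requirements": requirements}

def develop(spec):
    # 3. Build a working version of the design.
    return {"system": spec["prototype"]}

def evaluate(system):
    # 4. Measure against usability goals (task time, accuracy, insight, ...).
    # Stubbed result; a real study would run users on benchmark tasks.
    return {"goals_met": True}

def usability_engineering(max_iterations=5):
    """Iterate analyze -> design -> develop -> evaluate until goals are met."""
    system = None
    for _ in range(max_iterations):
        requirements = analyze_requirements()
        spec = design(requirements)
        system = develop(spec)
        if evaluate(system)["goals_met"]:
            break
    return system  # latest version, whether or not goals were met

if __name__ == "__main__":
    print(usability_engineering())
```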

  6. Myths about Usability • Usability = Voodoo • Usability = Learnability

  7. Myths about Usability • Usability = Voodoo • Usability = Learnability • Usability = Simple task performance

  8. Impact on Cognition Insight gained: Spotfire GeneSpring

  9. Myths about Usability • Usability = Voodoo • Usability = Learnability • Usability = Simple task performance • Usability = Expensive http://www.upassoc.org/usability_resources/usability_in_the_real_world/roi_of_usability.html

  10. Usability Engineering cycle: 1. Analyze Requirements -> 2. Design -> 3. Develop -> 4. Evaluate

  11. Requirements Analysis • Goal = understand the user & tasks • Methods: Ethnographic observation, interviews, cognitive task analysis • Challenge: Find the hidden problem behind the apparent problem

  12. Analysts’ Process Pirolli & Card, PARC

  13. Systems Biology Analysis • Beyond read-offs -> Model-based reasoning Mirel, U. Michigan

  14. Usability Engineering cycle: 1. Analyze Requirements -> 2. Design -> 3. Develop -> 4. Evaluate

  15. Why Emphasize Evaluation? • Many useful guidelines, but… • Quantity of evidence • Exploit domain knowledge Hunter, Tipney, UC-Denver

  16. Science of Usability (diagram): Phenomenon, Measurement, Modeling

  17. Measuring Usability in Visualization (diagram pairing levels of phenomena with measurements): system/algorithm -> frame-rate, capacity, …; visual -> realism, data/ink, …; perception/interaction -> task time, accuracy, …; inference/insight -> ?; goal/problem solving -> market, ? • 2 kinds of holes in the measurements

  18. Time & Accuracy • Controlled Experiments • Benchmark tasks
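As a hedged illustration of how time-and-accuracy data from such a controlled experiment might be analyzed, the Python sketch below runs a two-sample t-test on made-up completion times for two hypothetical tools; the numbers are not the data behind the study summarized on the following slides.

```python
# Comparing benchmark-task completion times from a controlled experiment.
# The values are fabricated for illustration only.
from scipy import stats

# Task completion times (seconds) for the same benchmark task with two tools.
times_tool_a = [42.1, 38.5, 55.0, 47.3, 61.2, 44.8, 50.9, 39.7]
times_tool_b = [58.4, 72.0, 66.3, 61.8, 75.5, 69.1, 63.2, 70.4]

# Two-sample t-test; a p-value below 0.05 is the usual significance
# criterion reported on results slides such as slide 20.
t_stat, p_value = stats.ttest_ind(times_tool_a, times_tool_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant: {p_value < 0.05}")

# Accuracy is typically reported alongside time, e.g., proportion correct.
correct_a = [1, 1, 0, 1, 1, 1, 1, 0]
print(f"Tool A accuracy: {sum(correct_a) / len(correct_a):.0%}")
```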

  19. Results

  20. + Consistent overall + Fast for single node analysis - Slow and inaccurate for expression across graph + Accurate for comparing timepoints (p < 0.05)

  21. Cerebral Munzner, UBC

  22. Insight-based Evaluation • Problem: Current measurements focus on low-level task performance and accuracy • What about Insight? • Idea: Treat tasks as dependent variable • What do users learn from this Visualization? • Realistic scenario, open-ended, think aloud • Insight coding • Information-rich results
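The sketch below illustrates one way coded insights from a think-aloud session could be tallied into the metrics reported later (count of insights, total domain value, time to first insight). The record fields and the value scale are assumptions for illustration, not the coding scheme actually used in the published study.

```python
# Hypothetical insight-coding records from one open-ended, think-aloud session.
from dataclasses import dataclass

@dataclass
class Insight:
    minute: float      # minutes into the session when the insight was reported
    value: int         # domain value assigned by an expert coder (e.g., 1-5)
    description: str   # what the participant learned from the visualization

session = [
    Insight(4.5, 2, "gene cluster X is up-regulated at late timepoints"),
    Insight(11.0, 4, "two samples look like outliers, possibly mislabeled"),
    Insight(18.2, 3, "expression pattern differs between the two conditions"),
]

count = len(session)                            # count of insights
total_value = sum(i.value for i in session)     # total domain value of insights
time_to_first = min(i.minute for i in session)  # averaged over sessions in practice

print(f"insights: {count}, total value: {total_value}, "
      f"time to first insight: {time_to_first} min")
```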

  23. Insight? Gene expression visualizations: GeneSpring, Spotfire, HCE, Cluster/Treeview, TimeSearcher

  24. Results (charts comparing Cluster/Treeview, TimeSearcher, HCE, Spotfire, GeneSpring): count of insights, total value of insights, average time to first insight (minutes)

  25. Insight Summary

  26. Users’ Estimation (charts comparing Cluster/Treeview, TimeSearcher, HCE, Spotfire, GeneSpring): total value of insights vs. users’ estimated insight percentage

  27. Insight Methodology • Difficulties: • Labor intensive • Requires domain expert • Requires motivated subjects • Short training and trial time • Opportunities: • Self reporting data capture • Insight trails over long-term usage – Insight Provenance
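As a sketch of the "self reporting data capture" opportunity, the snippet below appends timestamped, participant-reported insights to a log file, forming a simple insight trail over long-term use. The file name, record fields, and confidence scale are all hypothetical.

```python
# Minimal self-report logger for capturing an insight trail (insight provenance).
import json
import time

LOG_PATH = "insight_trail.jsonl"   # hypothetical log file, one JSON record per line

def report_insight(description, tool, confidence):
    """Append a self-reported insight to the provenance log."""
    record = {
        "timestamp": time.time(),
        "tool": tool,
        "description": description,
        "confidence": confidence,   # participant's own rating, e.g., 1-5
    }
    with open(LOG_PATH, "a") as log:
        log.write(json.dumps(record) + "\n")

# Example self-report made while exploring an expression dataset.
report_insight("cluster of immune genes peaks at 12h", tool="HCE", confidence=4)
```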

  28. Trend towards Longitudinal Evaluation • Multidimensional in-depth long-term case studies (MILCS) • Qualitative, ethnographic • GRID: Study graphics, find features, ranking guides insight, statistics confirm • But: Not replicable, Not comparative Shneiderman, U. Maryland

  29. Onward… • VAST Challenge • Analytic dataset with ground truth • E.g. Goerg, Stasko – Jigsaw study • BELIV Workshop – BEyond time and errors: novel evaLuation methods for Information Visualization

  30. Visual Analytics

  31. Embodied Interaction 1) Cognition is situated. 2) Cognition is time-pressured. 3) We off-load cognitive work onto the environment. 4) The environment is part of the cognitive system. 5) Cognition is for action. 6) Off-line cognition is body-based. -- Margaret Wilson, UCSC GigaPixel Display Lab, Virginia Tech Carpendale, U. Calgary
