
Evaluation of visualizations



  1. InfoVis, Winter 2011, Chris Culy. Evaluation of visualizations

  2. Levels: perception and interpretation
     - Visual/perceptual: how well can people perceive the distinctions the visualization intends?
     - Interpretation: how well can people interpret the visualization?

  3. Levels: use
     - Use in isolation:
       - How accurately can people use the visualization for a particular task in isolation?
       - How quickly can people use the visualization for a particular task in isolation?
     - Use as part of a broader goal:
       - How accurately can people use the visualization for a particular task as part of a broader goal?
       - How quickly can people use the visualization for a particular task as part of a broader goal?

  4. Levels: satisfaction
     - How satisfied are people with the visualization? (e.g. easy/hard, "cool", etc.) ≠ how well they can use it
     - How useful is the visualization for what they want to do? How well do they "get it"?
     - People may use or prefer a more difficult tool:
       - if it's "cool"
       - if they already know it (there's a learning curve for a new tool)
       - if it's cheaper

  5. Goals
     - Formative evaluation
       - Goal is to get specific information to improve the software
       - Done during the development cycle
       - Often informal
     - Summative evaluation
       - Goal is to get general information about how the software performs
       - Done at the end of (a) development cycle
       - Often (more) formal
       - Can also be used to improve the next iteration

  6. Some principles for experiments
     - There should be a specific purpose for the experiment: what are you evaluating?
     - Experiments should be task-based
     - The user should be introduced to the software, especially anything new
     - The user should not be "interfered with" during the experiment

  7. Some techniques for experiments
     - "Instrumenting" the program
       - Tracking clicks, etc., and then trying to analyse the patterns
       - Also for tracking speed and accuracy
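Instrumenting the program for speed and accuracy could be sketched roughly as below. This is an illustration, not anything from the course; the class and event names are hypothetical.

```python
import time

class InteractionLogger:
    """Records timestamped user events so speed and accuracy
    can be analysed after the session."""

    def __init__(self):
        self.events = []  # list of (timestamp, event_name, details)

    def log(self, event_name, **details):
        # monotonic() is unaffected by system clock changes,
        # so durations between events stay reliable
        self.events.append((time.monotonic(), event_name, details))

    def task_duration(self, start_event, end_event):
        """Seconds between the first start_event and the first
        end_event at or after it."""
        start = next(t for t, name, _ in self.events if name == start_event)
        end = next(t for t, name, _ in self.events
                   if name == end_event and t >= start)
        return end - start

    def accuracy(self, answer_event):
        """Fraction of logged answers marked correct=True."""
        answers = [d for _, name, d in self.events if name == answer_event]
        if not answers:
            return None
        return sum(1 for d in answers if d.get("correct")) / len(answers)
```

In use, the visualization's event handlers would call `log("task_start")`, `log("click", x=..., y=...)`, `log("answer", correct=True)`, and so on; the patterns in `events` are then available for later analysis.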

  8. Some techniques for experiments
     - "Instrumenting" the user: various means for perception
       - Eye tracking shows where people are paying attention

  9. Some techniques for experiments
     - Observation, with note taking and timing of the user's actions
     - User feedback
       - Pre/post-session questionnaire
       - "Think-aloud protocol": the user explains what they're doing and why as they're doing it
       - Explicit questions during a demo (informal)
       - Post-session interview ("debriefing")

  10. Formal vs. informal
      - Formal
        - Several subjects (6+)
        - Experimental design (comparison)
        - Often statistical analysis of results
        - Experiment usually recorded
      - Informal, "guerrilla" testing
        - Few subjects (3-4)
        - Quick, informal test to get user feedback on something
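The statistical analysis in a formal comparison could, as one possibility, use Welch's t statistic to compare task-completion times between two visualization designs. The data below is invented for illustration; in practice one would also look up a p-value (e.g. with `scipy.stats.ttest_ind`), which this standard-library sketch omits.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples,
    without assuming equal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

# Hypothetical task-completion times (seconds) for two designs,
# six subjects per condition
times_a = [12.1, 10.4, 11.8, 13.0, 9.7, 12.5]
times_b = [15.2, 14.1, 16.0, 13.8, 15.5, 14.9]

t = welch_t(times_a, times_b)  # negative here: design A was faster on average
```

With only a handful of subjects per condition, the degrees of freedom are low, so a large |t| is needed before the difference can be called significant.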

  11. Techniques for deployed software
      - "Bug" reports (when is a bug not a bug?)
      - Feature requests
      - Interviews (or narratives) with "customers"
      - Case studies (positive ones = "success stories")
      - Observation of "customers" using the software

  12. Interpreting users' reactions
      - Users can only react in terms of what they know or assume.
      - They don't analyse what they do, so their suggestions tend to be too concrete.
      - They often make specific suggestions that aren't necessarily the best solutions, since they don't know what the possibilities are.
      - They are honestly trying to be helpful.
