
Expanding the Scope: Pushing the Boundaries of HCI

Presentation Transcript


  1. Expanding the Scope: Pushing the Boundaries of HCI CHI Bangalore 22 March 2007 Joseph ‘Jofish’ Kaye Information Science Cornell University, Ithaca, NY jofish @ cornell.edu

  2. What I’m not going to tell you • Experience matters! • Because you know that • Emotion matters! • Because you know that • HCI is changing! • Because we all know that • Usability is dead, dead, dead! • Because it’s not; you’ve still got to get it right first.

  3. What I am going to talk about… • Is that when we’re working with this ‘new HCI’… that recognizes the importance of emotion, of experience, of the intangible but fundamental • … we need new ways of knowing • to do what we do • to know if what we’ve done is right

  4. What I am going to talk about… • … is how we know what we know, and how that changes. • What is the epistemology of HCI? • How did we come to know that the way we know things in HCI is valid? • How does the way we know things have to change when we’re doing different things? • How do we know if new ways of knowing are valid?

  5. What I am (also) going to talk about… • Is how boundary projects • work that is at the margins of our current practice • that may not be recognizable as everyday practice • can inform and change the way we work every day

  6. HCI Evaluation: Validity “Methods for establishing validity vary depending on the nature of the contribution. They may involve empirical work in the laboratory or the field, the description of rationales for design decisions and approaches, applications of analytical techniques, or ‘proof of concept’ system implementations” CHI 2007 Website

  7. Evaluation of the VIO • A minimal device for couples in long distance relationships to communicate intimacy • It’s about the experience; it’s not about the task www.intimateobjects.org Kaye, Levitt, Nevins, Golden & Schmidt. Communicating Intimacy One Bit at a Time. Ext. Abs. CHI 2005. Kaye. I just clicked to say I love you. alt.chi, Ext. Abs. CHI 2006.
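
To make the device concrete: the VIO gives each partner one button and one light, a click sends a single bit, and the partner’s object glows and fades back over time. Below is a minimal sketch of that click-and-fade logic in Python; the hour-long fade window and the linear decay are illustrative assumptions, not the device’s actual parameters.

```python
import time

# Minimal sketch of the VIO's one-bit interaction: a click from the
# remote partner sets the light to full brightness, which then fades.
# FADE_SECONDS and the linear decay are illustrative assumptions.
FADE_SECONDS = 3600.0

class VirtualIntimateObject:
    def __init__(self):
        self.last_click = None  # when the remote partner last clicked

    def receive_click(self):
        """The entire protocol: one bit, no payload, no metadata."""
        self.last_click = time.time()

    def brightness(self):
        """Red intensity from 1.0 (just clicked) down to 0.0 (faded)."""
        if self.last_click is None:
            return 0.0
        elapsed = time.time() - self.last_click
        return max(0.0, 1.0 - elapsed / FADE_SECONDS)
```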

  8. A Brief History and plan for the talk • Evaluation by Engineers • Evaluation by Computer Scientists • Evaluation by Experimental Psychologists & Cognitive Scientists • Evaluation by HCI Professionals • Evaluation in CSCW (briefly) • Evaluation for Experience

  9. 3 Questions to ask about an era Who are the users? Who are the evaluators? What are the limiting factors?

  10. Evaluation by Engineers Users are engineers & mathematicians Evaluators are engineers The limiting factor is reliability

  11. Evaluation by Computer Scientists Users are programmers Evaluators are programmers The speed of the machine is the limiting factor

  12. Evaluation by Experimental Psychologists & Cognitive Scientists Perceptual issues such as print legibility and motor issues arose in designing displays, keyboards and other input devices… [new interface developments] created opportunities for cognitive psychologists to contribute in such areas as motor learning, concept formation, semantic memory and action. In a sense, this marks the emergence of the distinct discipline of human-computer interaction. (Grudin 2006)

  13. Evaluation by Experimental Psychologists & Cognitive Scientists Users are users: the computer is a tool, not an end result Evaluators are cognitive scientists and experimental psychologists: they’re used to measuring things through experiment The limiting factor is what the human can do

  14. Case Study of Evaluation: Text Editors Roberts & Moran, 1982, 1983. Their methodology for evaluating text editors had three criteria: objectivity, thoroughness, and ease-of-use.

  15. Case Study: Text Editors objectivity “implies that the methodology not be biased in favor of any particular editor’s conceptual structure” thoroughness “implies that multiple aspects of editor use be considered” ease-of-use (of the method, not the editor itself) “the methodology should be usable by editor designers, managers of word processing centers, or other nonpsychologists who need this kind of evaluative information but who have limited time and equipment resources”


  17. Case Study: Text Editors “Text editors are the white rats of HCI” Thomas Green, 1984, quoted in Grudin, 1990.

  18. Evaluation by HCI Professionals Usability professionals They believe in expertise (e.g. Nielsen 1984) They made a decision to focus on better results, regardless of whether those results were experimentally provable.

  19. Case Study: The Damaged Merchandise Debate

  20. Damaged Merchandise Setup Early eighties-early nineties: usability evaluation methods (UEMs) - heuristics (Nielsen) - cognitive walkthrough - GOMS (see the KLM sketch below) - …
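
Of the methods listed above, GOMS is the one that yields a numeric prediction, so it is worth seeing what such a prediction looks like. Here is a back-of-the-envelope keystroke-level model (KLM, the simplest member of the GOMS family) in Python; the operator times are the commonly cited Card, Moran & Newell averages, and the task sequence is a hypothetical example, not one from the comparison studies.

```python
# Back-of-the-envelope keystroke-level model (KLM), the simplest member
# of the GOMS family. Operator times are the commonly cited Card, Moran
# & Newell averages; the task sequence below is hypothetical.
OPERATORS = {
    "K": 0.28,  # keystroke or button press (average typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Predicted expert execution time for a string of KLM operators."""
    return sum(OPERATORS[op] for op in sequence)

# Hypothetical task: think, point at a button, click it, home back to
# the keyboard, type two characters.
print(round(klm_time("MPKHKK"), 2))  # -> 3.69 seconds predicted
```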

  21. Damaged Merchandise Comparison Studies Jeffries, Miller, Wharton and Uyeda (1991) Karat, Campbell and Fiegel (1992) Nielsen (1992) Desurvire, Kondziela, and Atwood (1992) Nielsen and Phillips (1993)

  22. Damaged Merchandise Panel Wayne D. Gray, Panel at CHI’95 Discount or Disservice? Discount Usability Analysis at a Bargain Price, or Simply Damaged Merchandise?

  23. Damaged Merchandise Paper Wayne D. Gray & Marilyn Salzman Special issue of the journal Human-Computer Interaction: Experimental Comparisons of Usability Evaluation Methods

  24. Damaged Merchandise Response Commentary on Damaged Merchandise Karat: experiment in context Jeffries & Miller: real-world Lund & McClelland: practical John: case studies Monk: broad questions Oviatt: field-wide science MacKay: triangulate Newman: simulation & modelling

  25. Damaged Merchandise What’s going on? Gray & Salzman, p19 There is a tradition in the human factors literature of providing advice to practitioners on issues related to, but not investigated in, an experiment. This tradition includes the clear and explicit separation of experiment-based claims from experience-based advice. Our complaint is not against experimenters who attempt to offer good advice… [it is that] the advice may be understood as research findings rather than the researcher’s opinion.


  27. Damaged Merchandise Clash of Paradigms Experimental Psychologists & Cognitive Scientists (who believe in experimentation) vs. HCI Professionals (who believe in experience and expertise, even if ‘unprovable’) (and who were trying to present their work in the terms of the dominant paradigm of the field.)

  28. Damaged Merchandise Clash of Paradigms • Important: • It’s not that one of these is right and the other is wrong • It’s not that one of these is the ‘old idea’ and the other the ‘new idea’ • This is an example of a paradigm clash • It can be hard to get new ideas accepted; it’s important to recognize paradigm clashes when you see them so you can work to solve the real problem.

  29. CSCW Briefly… • CSCW vs. HCI • Not just groups instead of users, but philosophy & approach (ideology?) • Posits that work is member-created, dynamic, and explicitly not cognitive or modelable • Follows the failure of ‘workplace studies’ to characterize work

  30. Evaluation in CSCW • Ramage, The Learning Way (Ph.D., Lancaster, 1999) • No single ‘right’ or ‘wrong’ • Identify why you’re evaluating here • Determine stakeholders • Observe & analyze • Learn • Note the differences between this kind of approach and more traditional HCI user testing. • A different approach from HCI; a separate paradigm.

  31. Experience-focused HCI A possibly emerging sub-field, drawing from traditions and disciplines outside the field Emphasis on the experience, not [just] the task But how to evaluate?

  32. Experience-focused HCI: cultural commentators Gaver et al.: cultural commentators with expertise in their own fields provide multi-layered assessment. Gaver, W. (2007) Cultural Commentators for Polyphonic Assessment. To appear, IJHCI.

  33. Experience-focused HCI: Home Health Horoscope • Domestic ubiquitous computing • Privacy-preserving sensors • Wellbeing in the home • Output to encourage reflection • Emphasis on the users’ interpretation • Designing for serendipity • Difficult problem Gaver, Sengers, Kerridge, Kaye & Bowers. Home Health Horoscope. To appear, Proc. CHI’07

  34. Experience-focused HCI: Virtual Intimate Object (VIO) Cultural probes to provide user-interpreted thick descriptions of use experience Kaye, Levitt, Nevins, Golden & Schmidt. Communicating Intimacy One Bit at a Time. Ext. Abs. CHI 2005.

  35. Experience-focused HCI: Virtual Intimate Object (VIO) Did it make you feel closer to your partner? • “I was surprised to see one morning that my partner had actually turned on his computer just to push VIO and then turned it off again” • “YES - We share this experience together, and we use VIO aware that from another part of the world someone was thinking to each other!” • “When VIO became red I feel very happy, because I knew that my boyfriend was clicking on it. So this communication was in a instant.” Kaye, J. ‘J.’ I just clicked to say I love you. alt.chi, Ext. Abs. CHI 2006.

  36. Experience-focused HCI: Virtual Intimate Object (VIO) The color that currently best represents my relationship is… • “Amber/yellow --> do I proceed w/ caution or speed up to beat the red or slow down anticipating a step” • “Purple - we have a more matured, aged relationship rather than a new, boundless one which would best be described by red. Purple is the more aged, ripened form of red.” • “Yellow! Like a sun, like a summer. I often laugh with Sven especially in those days. Using Vio is really funny and interesting.” Kaye, J. ‘J.’ I just clicked to say I love you. alt.chi, Ext. Abs. CHI 2006.

  37. Whereabouts Clock Microsoft Research Cambridge 5 families; phones and the Whereabouts Clock (WAC) for all Open-ended diaries Weekly visits Interpretation-based Emphasizes location-in-interaction, rather than the technical aspects of location. Sellen, A., Eardley, R., Izadi, S., and Harper, R. The whereabouts clock: early testing of a situated awareness device. Ext. Abs. CHI’06 Brown, Taylor, Izadi, Sellen, & Kaye. Locating Family Values: A Field Trial of the Whereabouts Clock. Under consideration for Ubicomp 2007.

  38. Ambient Ink Display Current iteration: a screensaver for handwritten notes. Shirley Gaw interviewed 17 tablet users at MS TVP Strong understanding of the role of the Tablet PC Screensaver: only 1/17 interested - privacy concerns. Failure! But… Tablet PCs: many difficulties as a PC/laptop. Why use them? Impression management: cutting edge, courtesy (and privacy) It’s not about the task; it’s about presentation. Hsieh, G., Wood, K., and Sellen, A. 2006. Peripheral display of digital handwritten notes. Proc. CHI’06 Gaw, Kaye & Wood. Evaluating the Ambient Ink Display. To be submitted, Ubicomp 2007.

  39. Sexual Interactions: CHI’06 Workshop • Why is deception acceptable in porn browsing? (But Jakob says!) • What are the ethical and legal implications of virtual sex and prostitution? • Is mouse/screen/keyboard the right interface for all interactions? • Developing social norms of sex in Second Life Brewer, Kaye & Wyche. Sexual Interactions: Why we should talk about sex in HCI. Ext. Abs. CHI 2006.

  40. Experience-focused HCI These projects, and many others like them… require different ways of knowing and produce different kinds of knowledge. They can’t be evaluated by looking under the usability lamppost.

  41. In summary… Kaye & Sengers. The evolution of evaluation. alt.chi, Proc. CHI’07 Kaye, Boehner, Laaksolahti, & Ståhl. Evaluating Experience-focused HCI. SIG, CHI’07. We need to recognize the ways in which multiple epistemologies, not just the experimental paradigm of science, must inform the hybrid discipline of human-computer interaction if we wish to build systems that support users’ increasingly rich interactions with technology. Doing so requires us to be conscious of how we think about what we know, and recognize the importance of new ways of knowing.

  42. An evolving discussion Shameless plug: alt.chi session, Evaluating Evaluation (CHI, Monday 4:30pm) Evaluating Experience-focused HCI Special Interest Group (CHI, Thursday 9am) Special thanks to Wendy Ju & Terry Winograd. Also thanks to Phoebe Sengers & the Culturally Embedded Computing Group, BostonCHI, Alex Taylor, Ken Wood, Richard Harper, Abi Sellen, Shahram Izadi, Lorna Brown and the CMLG, Microsoft Cambridge, Apala Lahiri Chavan & Eric Schaffer, HFI, CHI Bangalore, CHI Mumbai, the Cornell S&TS Department, Maria Håkansson & IT University Göteborg, Louise Barkhuus, Barry Brown & University of Glasgow, Mark Blythe & University of York, Andy Warr & The Oxford E-Research Center, Susanne Bødker, Marianne Graves Petersen & The University of Aarhus, Jonathan Grudin, Liam Bannon, Gilbert Cockton, William Newman, Kirsten Boehner, Jeff Hancock, Bill Gaver, Janet Vertesi, Kia Höök, Jarmo Laaksolahti, Anna Ståhl, Helen Jeffries, Paul Dourish, Jen Rode, Peter Wright, Ryan Aipperspach, Bill Buxton, Michael Lynch, Seth ‘Beemer’ McGinnis & Katherine Isbister.
