
Evaluating Ubiquitous Computing Applications In Situ



  1. Evaluating Ubiquitous Computing Applications In Situ Katherine Everitt (UW, IRS intern) Sunny Consolvo (IRS, UW) Ian Smith (IRS) James Landay (IRS, UW CSE) Intel Research Seattle  University of Washington In-Use, In-Situ Workshop  28 October 2005

  2. Talk Overview • Two evaluation approaches: Wizard of Oz; mobile phones (existing technologies) • Cross-cutting problem: prototype fidelity • Value of in situ evaluations

  3. In Situ Evaluation Technique: Wizard of Oz • Two examples: • CareNet Display • Home Energy Tutor Configuration Tool

  4. WoZ Example: CareNet Display • Interactive photo augmented with care-relevant updates • Goal: help local care network members provide day-to-day care

  5. WoZ Example: CareNet Display • System architecture for evaluation
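The architecture slide is image-only in the deck. As a rough illustration of the Wizard-of-Oz pattern it depicts, here is a minimal sketch in Python: a researcher (the "wizard") hand-enters care events that the participant-facing display polls for, so the display behaves as if it were sensor-driven. All names and the in-process queue are hypothetical stand-ins; the real CareNet Display ran over GPRS with its own infrastructure.

```python
# Minimal Wizard-of-Oz relay (hypothetical sketch, not the actual
# CareNet Display code). A human "wizard" enters observed care events;
# the participant-facing display polls for them, so it appears to be
# sensor-driven while a researcher supplies the data by hand.
import json
import queue
import threading
import time

updates = queue.Queue()  # wizard -> display channel (stand-in for GPRS)

def wizard_console():
    """Researcher side: each observed event becomes a display update."""
    for event in ["meals: breakfast eaten", "meds: morning dose taken"]:
        updates.put(json.dumps({"ts": time.time(), "text": event}))

def display_loop(poll_seconds=1.0, max_polls=5):
    """Participant side: poll for updates and render them."""
    for _ in range(max_polls):
        try:
            update = updates.get(timeout=poll_seconds)
            print("DISPLAY:", json.loads(update)["text"])
        except queue.Empty:
            pass  # nothing new; keep showing the current photo

threading.Thread(target=wizard_console).start()
display_loop()
```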

  6. WoZ Example: CareNet Key Results • Improved relationships within the care network (qualitative feedback) • Including caregiver-caregiver relationships • Improved quality of conversations with the elder • Before deployment, all elders said they would share with local members • Distant relations were a concern at the start • Elders were bad predictors of whom to share with

  7. In Situ Wizard of Oz: Challenges • Data collection was labor-intensive • A full-time intern for 3 months! • No exceptions! • Not including development or analysis! • Unreliable technology • GPRS • Difficult-to-use touch screen • Can't rely on participants to alert you when the technology fails • No matter how much you beg!
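One way to stop depending on participants to report failures is to have each deployed device send periodic heartbeats and alert the team when they stop. The deck does not say whether the CareNet deployment automated this; the following is a hypothetical sketch of the idea, with invented names and intervals.

```python
# Heartbeat watchdog (hypothetical sketch): deployed devices check in
# periodically, and a silent device pages the research team instead of
# waiting for the participant to notice (and report) a failure.
import time

HEARTBEAT_INTERVAL = 15 * 60           # device pings every 15 minutes
ALERT_AFTER = 3 * HEARTBEAT_INTERVAL   # tolerate two missed pings

last_seen: dict[str, float] = {}       # device id -> time of last ping

def record_heartbeat(device_id: str) -> None:
    last_seen[device_id] = time.time()

def silent_devices() -> list[str]:
    """Device ids that have been quiet longer than the alert window."""
    now = time.time()
    return [d for d, t in last_seen.items() if now - t > ALERT_AFTER]

record_heartbeat("display-elder-01")
last_seen["display-elder-02"] = time.time() - 4 * HEARTBEAT_INTERVAL
print("Check on:", silent_devices())   # -> ['display-elder-02']
```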

  8. From Fake To Real…

  9. In Situ Evaluation Technique: Mobile Phones (existing technologies) • Houston: sharing fitness information within a social group • Focus on the social effects • Competition, peer pressure/support • Reno: sharing location data within a family • Focus on privacy issues • Location-system design guidance
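Reno's privacy focus centered on user-controlled disclosure: a location request should yield nothing unless the phone's owner approves it. The sketch below illustrates that request/approve pattern; it is a hypothetical reconstruction, not Reno's actual code, and the `approve` policy stands in for prompting the user.

```python
# Person-to-person location disclosure (hypothetical sketch of the
# request/approve pattern; not Reno's actual code). Nothing leaves the
# phone unless the owner's policy approves the specific requester.
from typing import Callable, Optional

def handle_location_request(requester: str, current_place: str,
                            approve: Callable[[str], bool]) -> Optional[str]:
    """Return a disclosure message if approved, else disclose nothing."""
    if approve(requester):
        return f"I am at {current_place}"
    return None  # silently decline; the requester learns nothing

# 'approve' stands in for prompting the user; here, a family-only policy.
family = {"mom", "spouse"}
reply = handle_location_request("mom", "Green Lake",
                                approve=lambda who: who in family)
print(reply)  # -> I am at Green Lake
```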

  10. Mobile Phone Example: Houston • Mobile computing + social influence = increased step count • Key decision: step counts are a reasonable proxy for physical activity

  11. Mobile Phone Example: Houston

  12. Mobile Phone Example: Houston • 3-week in situ study of Houston • 3 groups of women aged 28-42 (13 participants total) • Participants carried an "extra" mobile phone daily • Modifying their own phones would have been too complex • One group with no "assisted" sharing • Two groups with enhanced sharing • First week used to set a baseline

  13. Houston: Key Results • Participants wanted credit for all activities • And proper credit within an activity • 7 of 13 participants increased their daily step count (on average) • Qualitative data suggests that most participants changed their behavior • Sharing motivates some people • Design implication: support, but do not require, social interaction
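The step-count result in slide 13 implies the baseline comparison set up in slide 12: average each participant's daily steps over the baseline week, then compare against the rest of the study. A sketch of that arithmetic, using invented data (the deck does not publish raw counts):

```python
# Baseline comparison (invented data; the deck does not publish raw
# counts): average each participant's daily steps over the baseline
# week, then compare against the remaining study weeks.
from statistics import mean

# participant -> (baseline-week daily steps, study-weeks daily steps)
steps = {
    "P1": ([5200, 4800, 6100, 5000, 5500, 4900, 5300],
           [6400, 7000, 5900, 6800, 7200, 6100, 6600]),
    "P2": ([8100, 7900, 8400, 8000, 7700, 8200, 8300],
           [7800, 7600, 8100, 7900, 7500, 8000, 7700]),
}

for pid, (baseline, study) in steps.items():
    change = mean(study) - mean(baseline)
    print(f"{pid}: baseline {mean(baseline):.0f}/day, "
          f"study {mean(study):.0f}/day, change {change:+.0f}")
```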

  14. Mobile Phones: Challenges • The input/output of a mobile phone is limited • Exacerbated by our development choices • A 2nd mobile phone was unnatural for participants • But the 2nd phone offered us "more control" • The Nokia 6600 is considered "too big" by some • Data loss when out of range • Timing of SMS • Charging batteries: custom apps drain batteries faster than participants are used to • Nightly charging isn't always done
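The out-of-range data loss and SMS-timing problems are the classic case for store-and-forward: queue readings locally on the phone and flush them when coverage returns. A hypothetical sketch of that pattern (the deck does not describe Houston's actual transport code):

```python
# Store-and-forward (hypothetical sketch; not Houston's actual
# transport code): buffer readings locally and flush when coverage
# returns, instead of losing data while out of range.
import json
from collections import deque

outbox: deque[str] = deque()

def record(reading: dict) -> None:
    """Always enqueue locally first; never depend on live coverage."""
    outbox.append(json.dumps(reading))

def flush(send) -> int:
    """Send queued readings oldest-first; keep the rest on any failure."""
    sent = 0
    while outbox:
        try:
            send(outbox[0])
        except OSError:        # out of range / transport error
            break
        outbox.popleft()
        sent += 1
    return sent

record({"participant": "P3", "steps": 6120, "day": "2005-10-12"})
flush(print)  # stand-in transport: prints the queued JSON record
```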

  15. Mobile Phones: Loss of expensive prototype technologies • Theft • Theft from the participant • Theft from you by the participant • Damage • Pedometers lost to the toilet, washing machine, sea, etc. • Phones/PDAs dropped (resulting in cracked screens) • Loss (leaving the phone somewhere)

  16. Cross-cutting problem: Fidelity of evaluation prototypes • iGlove • iBracelet

  17. Value of early-stage in situ evaluations: No pain, no gain • Where would this woman clip a pedometer or carry a large cell phone? • The glow of the CareNet Display was often distracting for participants who could see it while watching a movie or trying to sleep

  18. Thank you! • Questions? Comments? • Contact us at: • everitt@cs.washington.edu • [sunny.consolvo, ian.e.smith, james.a.landay]@intel.com

  19. Home Energy Tutor (HET) config tool pics

  20. Some homeowners’ sensor installations

  21. Homes are not all the same

  22. Someone else’s home
