Sensing & Mobility
Ken Hinckley, Microsoft Research
Collaborators: Jeff Pierce, Mike Sinclair, Eric Horvitz
Disclaimer: Opinions are Ken's only and may not reflect the views of Microsoft, or anyone else for that matter.
Sense more than just explicit commands
Background Sensing: Can it Live Up to its Promise?
Background info (presence, sounds, physical contact, …) can be sensed & exploited
Simplify & enhance the user experience via better awareness of context
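As a concrete illustration of the kind of background sensing sketched above, a minimal example (purely illustrative; the sensor names, thresholds, and actions are assumptions, not from the talk) of mapping implicit sensor cues to actions the user never explicitly requests:

```python
# Hypothetical sketch: inferring an implicit action from background sensor
# readings. Sensor cues and thresholds here are illustrative assumptions.

def infer_orientation(tilt_deg: float) -> str:
    """Map a tilt-sensor angle to a display orientation."""
    return "landscape" if abs(tilt_deg) > 45 else "portrait"

def background_action(holding: bool, tilt_deg: float, near_face: bool) -> str:
    """Combine simple sensor cues into one implicit action."""
    if not holding:
        return "standby"          # device set down: dim or sleep
    if near_face:
        return "voice-memo"       # gripped and held up to the face
    return f"rotate-{infer_orientation(tilt_deg)}"

print(background_action(True, 70.0, False))   # rotate-landscape
```

No explicit command is issued anywhere: the device acts on what it senses, which is exactly the promise (and the risk) the slides go on to question.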
Sensing: What's Unique about Mobile Devices?
Mobile devices are used under far more demanding conditions than the desktop (e.g., while driving)
Even "click on a button" can be difficult due to attentional demands
Provide services/enhancements the user would not have the cognitive resources or time to perform
Does the device automatically sense what it needs? Push?
Getting text into the device? Communication/read-only?
Environments are changing, but what properties should be sensed?
Do you need lots of sensor fusion to do anything useful?
Is sensing only useful for small, task-specific devices?
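To make the sensor-fusion question above concrete: even a simple weighted vote over a few noisy binary cues is already a form of fusion, and needs only a few lines. (The cue names, weights, and threshold below are assumptions for the sketch, not a real device API.)

```python
# Illustrative weighted-vote fusion over noisy binary sensor cues.
# Cue names and weights are assumptions made for this sketch.

def fuse(cues: dict, weights: dict, threshold: float = 0.5) -> bool:
    """Return True if the weighted fraction of active cues exceeds threshold."""
    total = sum(weights.values())
    score = sum(weights[name] for name, on in cues.items() if on)
    return score / total > threshold

cues = {"touch": True, "tilt_moving": False, "proximity": True}
weights = {"touch": 0.5, "tilt_moving": 0.2, "proximity": 0.3}
print(fuse(cues, weights))  # True: 0.8 of the weight says "in use"
```

Whether such lightweight fusion suffices, or whether useful behavior demands much heavier machinery, is precisely the open question the slide poses.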
Background Sensing: Some Open Issues
What do users expect? Do they like sensing? Do they care?
How do we overcome false positives / negatives?
What would an evaluative / scientific approach to sensing UIs look like?
Automatic action vs. user control & overrides
Limits? Where is explicit user input necessary?
"Special cases" & the complexity of mobile environments may confuse sensors or override them with noise
What are some issues / tradeoffs?
Quick access vs. inadvertent activation of features
Sensor & display quality vs. power consumption
Cost, weight, features vs. UI complexity, …
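The quick-access vs. inadvertent-activation tradeoff above can be illustrated with a simple debounce: a common technique (an assumed illustration, not something prescribed in the talk) that suppresses transient false positives by requiring a cue to persist for several samples before acting — at the cost of slower access.

```python
# Sketch of one way to trade quick access against inadvertent activation:
# only act once a cue has persisted for `required` consecutive samples.

class Debouncer:
    def __init__(self, required: int = 3):
        self.required = required  # samples of persistence before acting
        self.count = 0            # consecutive active samples seen so far

    def update(self, active: bool) -> bool:
        """Feed one sensor sample; return True once the cue has persisted."""
        self.count = self.count + 1 if active else 0
        return self.count >= self.required

d = Debouncer(required=3)
samples = [True, True, False, True, True, True]  # brief blip, then sustained
print([d.update(s) for s in samples])
# [False, False, False, False, False, True]
```

Raising `required` reduces inadvertent activation but delays the feature; lowering it does the reverse — the tradeoff the slide names, made explicit as a single parameter.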