HCI in Ubiquitous Computing



  1. HCI in Ubiquitous Computing 양현승 (Hyun Seung Yang), AIM (AI & Media) Lab, Department of Electrical Engineering and Computer Science, KAIST hsyang@cs.kaist.ac.kr http://mind.kaist.ac.kr

  2. Contents • Ubiquitous Computing • HCI in U-C • Embedding Interaction • U-C HCI Researches

  3. Ubiquitous Computing • Ubiquitous computing is the method of enhancing computer use by making many computers available throughout the physical environment, but making them effectively invisible to the user (Mark Weiser, Xerox PARC)

  4. Ubiquitous Computing
  • We are surrounded by computing
  • Computing and processing are embedded into everyday devices
  • There are many computers/processors per person
  • Information access and communication are possible virtually everywhere
  • Dedicated computing devices (information appliances) are all around us
  • Devices can be connected and networked
  • What gets us here?

  5. Ubiquitous Computing
  • Mark Weiser: computers enter everyday life
  • Help people with everyday tasks in the office and at home (at any time, any place)
  "A good tool is an invisible tool. By invisible, I mean that the tool does not intrude on your consciousness; you focus on the task, not the tool." [Weiser 94]

  6. HCI themes with U-Life
  • Three past interaction themes:
  • Natural Interfaces
  • Context-Aware Interaction
  • Automated Capture & Access to Life Experiences
  • New interaction theme proposed:
  • Everyday Computing

  7. Natural Interfaces
  • Forms of natural interfaces: speech, gestures, handwriting (pen-based/free-form)
  • Issues encountered:
  • Need a way to represent information with the new interface
  • Error-prone; even humans can't perfectly read handwriting

  8. Context-Aware Interaction
  • What is appropriate context to use?
  • Current: user and location
  • Future: time, history, other users
  • How to represent this context? (a sketch follows below)
  • Incorporate hierarchical info and relations
  • Truly ubiquitous?
  • Limitation of many technologies.
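To make the representation question concrete, here is a minimal Python sketch (not from the slides) of a hierarchical context record. The class names, fields, and the location hierarchy are illustrative assumptions; the point is only that a hierarchical model lets a query at a coarse level (a building) match context captured at a finer level (an office).

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Location:
    """Hierarchical location: each place knows its parent (room -> floor -> building)."""
    name: str
    parent: Optional["Location"] = None

    def path(self) -> List[str]:
        # Walk up the hierarchy so "office 301" also matches "3rd floor" or "building N1".
        node, names = self, []
        while node is not None:
            names.append(node.name)
            node = node.parent
        return names


@dataclass
class Context:
    """Current context (user + location) plus the 'future' dimensions the slide lists."""
    user: str
    location: Location
    time: Optional[str] = None                          # e.g. "14:30"
    history: List[str] = field(default_factory=list)    # recent activities
    other_users: List[str] = field(default_factory=list)


# Usage: a query such as "is the user somewhere in building N1?" uses the hierarchy.
building = Location("building N1")
floor = Location("3rd floor", parent=building)
office = Location("office 301", parent=floor)
ctx = Context(user="alice", location=office, other_users=["bob"])
print("building N1" in ctx.location.path())   # True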

  9. Context-Aware Interaction [diagram: dimensions of context: location, identity, objects]

  10. Everyday Computing: Things to Be Considered
  • No clear beginning and end to activities
  • Interruption is expected
  • Multiple activities operate concurrently
  • Time is an important discriminator
  • Associative models of information

  11. Embedding Interaction

  12. U-Life? [diagram: web servers, electronic servers, mobile browsers, web browsers]

  13. Change of UI Paradigm
  • From a single screen-based UI to interaction with a number of U-devices (distributed + interconnected)
  • Ubiquitous computing: highly personal and mobile appliances; systems that are integrated into the everyday environment

  14. User Interface in U-C
  • Requirements
  • Distribution of UI: all U-devices are distributed
  • Implicit HCI: reduce the need for explicit HCI and let explicit interfaces virtually disappear into the environment
  • Awareness of the situation, the environment and the aims of the user
  • Being noticed only when needed

  15. User Interface in U-C
  • Current interaction: explicit HCI
  • By command line
  • By direct manipulation using a GUI, gesture, or speech input
  • Interaction in U-C: implicit HCI
  • It allows the computer to interpret the user's behavior and the surrounding situation and use this information as input (contrast sketched below)
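The contrast between the two interaction styles can be made concrete with a toy Python sketch; the command string, sensor fields, and the lighting rule below are all hypothetical, not from the slides.

def explicit_hci(command: str) -> str:
    """Explicit interaction: the user issues a command (CLI, GUI, gesture, speech)."""
    if command == "lights on":
        return "turning lights on"
    return "unknown command"


def implicit_hci(observations: dict) -> str:
    """Implicit interaction: behavior and situation are interpreted as input.

    The rule below is a made-up example: if the user sits down to read
    and the room is dark, brighten the lights without being asked.
    """
    if observations.get("activity") == "reading" and observations.get("lux", 1000) < 150:
        return "raising light level for reading"
    return "no action"


print(explicit_hci("lights on"))                        # command issued on purpose
print(implicit_hci({"activity": "reading", "lux": 80})) # behavior observed, not commanded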

  16. What is different between traditional HCI and HCI in UbiComp?
  • Output modalities: not just an audio-visual channel; all senses!
  • Input modalities: more than pressing buttons and moving an object in two dimensions
  • Distribution: physical and conceptual
  • Magic beyond the screen
  • … it is a vivid physical relationship

  17. Development Process? Research Approach?
  • No longer just designing and programming a GUI
  • Interdisciplinary teams: ethnography, design, CS
  • It is about creating an experience by:
  • Understanding the interaction and process
  • Designing and constructing a set of devices and an environment
  • Implementing the human-information interface based on the created devices/environment
  • Testing it yourself
  • Testing it with users … then going back to refine the hardware and starting again

  18. Prototypes
  Functional prototypes are essential to learn, understand and experience how to interact with the ubiquitous computer.
  • From the idea to knowledge
  • Prototyping has been central to hallmark research in the area (e.g. ParcTab, ActiveBadge)
  • Learning occurs along the prototyping process as well as in use
  • Evaluation
  • Functional prototypes are the means for evaluation
  • "Confronting" real people, already with version 0.001
  • Deployment in a living-lab environment
  • Facilitating everyday environments with real users

  19. The Ubi-Comp Environment is Itself the Interface
  • Everyday objects augmented with sensing: tables, chairs, glasses, …
  • Creating a digital shadow reflecting the interaction

  20. Embedding Interaction
  • Basic technologies for embedding interaction
  • Sensing technologies: environmental conditions, users' location, co-location with others, physiological and emotional state of the user, user goals, user schedules, …
  • Agent technologies: combining a multitude of sometimes contradictory inputs to make sense at a higher level (a fusion sketch follows below), and adapting a system's output to be appropriate to whatever situation might arise
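One simple way an agent can "make sense at a higher level" of contradictory inputs is confidence-weighted voting. The sketch below is a minimal Python illustration under assumed sensors and confidence values, not the slides' own algorithm.

from collections import defaultdict
from typing import List, Tuple

# Each reading is (claimed user location, sensor confidence in [0, 1]).
# The sensors disagree: the badge says hallway, vision and Wi-Fi say office.
readings: List[Tuple[str, float]] = [
    ("office", 0.8),   # vision-based tracker
    ("hallway", 0.4),  # stale IR badge sighting
    ("office", 0.6),   # Wi-Fi signal-strength estimate
]


def fuse(readings: List[Tuple[str, float]]) -> str:
    """Resolve contradictory inputs by summing confidence per hypothesis."""
    scores = defaultdict(float)
    for location, confidence in readings:
        scores[location] += confidence
    return max(scores, key=scores.get)


print(fuse(readings))  # "office" wins: 1.4 vs 0.4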

  21. Implicit Interaction (1/2)
  • Implicit Human-Computer Interaction (iHCI): the interaction of a human with the environment and with artifacts, aimed at accomplishing a goal. Within this process the system acquires implicit input from the user and may present implicit output to the user.
  • Implicit input: actions and behaviour of humans that are performed to achieve a goal and are not primarily regarded as interaction with a computer, but are captured, recognized and interpreted by a computer system as input.
  • Implicit output: output of a computer that is not directly related to an explicit input and which is seamlessly integrated with the environment and the task of the user.

  22. U-C HCI Researches

  23. OXYGEN Project • Speech and vision technologies enable us to communicate with Oxygen as if we were interacting with another person, saving much time and effort. (MIT)

  24. AwareHome: Designing the Interactive Experience
  • Digital Family Portrait: reconnects geographically distant extended family members by allowing them to remain aware of each other in a non-obtrusive, lightweight manner
  • What Was I Cooking?: a context-aware system that captures the transient information of recent activities and passively displays it as visual cues
  • Gesture Pendant: recognizes gestures and translates them into commands for home appliances (a mapping sketch follows below)
  An AwareHome with human-like perception could improve quality of life for many, especially seniors. (Georgia Tech)
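At its simplest, the Gesture Pendant's "gestures into commands" idea is a mapping from recognized gesture labels to appliance commands. The Python sketch below is a hypothetical illustration; the gesture names, devices, and actions are invented, not Georgia Tech's implementation.

# Hypothetical mapping from recognized gesture labels to appliance commands.
GESTURE_COMMANDS = {
    "circle_clockwise": ("lamp", "brighten"),
    "circle_counterclockwise": ("lamp", "dim"),
    "swipe_up": ("blinds", "raise"),
    "swipe_down": ("blinds", "lower"),
}


def dispatch(gesture: str) -> str:
    """Translate a recognized gesture into a command for a home appliance."""
    if gesture not in GESTURE_COMMANDS:
        return "gesture not recognized"
    device, action = GESTURE_COMMANDS[gesture]
    return f"sending '{action}' to {device}"


print(dispatch("circle_clockwise"))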

  25. EasyLiving (1)
  EasyLiving is developing a prototype architecture and technologies for building intelligent environments.
  Key features:
  • Computer vision for person tracking and visual user interaction
  • Multiple sensor modalities combined
  • Use of a geometric model of the world to provide context
  • Automatic or semi-automatic sensor calibration and model building
  • Fine-grained events and adaptation of the user interface
  • Device-independent communication and data protocols
  • Ability to extend the system in many ways
  [system architecture diagram: person tracking (Person Detector, Person Tracker), world model (World Model, Rules Engine, Agent Lookup), room control (Room Control UI, Room Lights, A/V Media Systems), and authentication (Seat Sensors, Fingerprint Logon, PC Logon, Terminal Server, Desktop Manager, KB/Mouse Redirect)]
  (Microsoft)

  26. EasyLiving (2)
  [diagram: color and depth views, "person creation zone", past locations, people patches, new sensor measurement, predicted location]
  • Person detection: stereo processing with commercial software; background subtraction and person detection (sketched below); reports sent to the central person tracker at about 7 Hz
  • Person tracking: each new report from a sensor is processed
  (Microsoft)
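Background subtraction, the detection step named above, can be sketched in a few lines of NumPy. This is a toy illustration: the real system also used stereo depth, and the threshold and minimum blob size here are made-up values.

import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for camera frames: a learned background and a current frame
# in which a "person" region differs strongly from the background.
background = rng.integers(90, 110, size=(48, 64)).astype(np.int16)
frame = background.copy()
frame[10:40, 20:30] += 80          # a bright foreground region (the person)

# 1. Background subtraction: per-pixel absolute difference, then threshold.
diff = np.abs(frame - background)
foreground = diff > 40             # made-up threshold

# 2. Person detection: accept the foreground only if it is large enough
#    to plausibly be a person rather than noise (made-up minimum size).
MIN_PERSON_PIXELS = 100
if foreground.sum() >= MIN_PERSON_PIXELS:
    ys, xs = np.nonzero(foreground)
    # A report like this would be sent to the central person tracker (~7 Hz).
    report = {"centroid": (float(ys.mean()), float(xs.mean())),
              "size": int(foreground.sum())}
    print("person report:", report)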

  27. HomeLab
  • Philips HomeLab: looks and feels like a regular home, for testing new home-technology prototypes in the most realistic way possible
  • WWICE
  • PHENOM
  • EASY ACCESS
  • POGO: an interactive game for children; a virtual story world interfaced by active tools
  • Intelligent Personal-Care Environment: based on measurements from an activity monitor and heart-rate sensor
  (Philips Research)

  28. KAIST AIM Lab Research

  29. Role of the Wearable Computer in a Ubiquitous Computing Environment
  • It easily acquires personal data (personalization).
  • It guarantees the safety of personal data (privacy).
  • It enhances the user's interaction with many devices.
  • It reduces network traffic from transmitting personal data.
  • It assists us in our work (agent).

  30. Background
  • Various electronic media will be scattered around us in the near future (ubiquitous computing environments).
  • We will frequently interact with those media. (We may find this interaction quite annoying.)
  • A system that assists us in interacting with those media in our daily life is therefore required.
  • This system should understand a user's intention or preference.
  • This system should communicate with various electronic media.

  31. Research Objective
  ♦ To establish some concepts: IEM, IWAS
  ♦ To propose an IWAS prototype and IEM prototypes
  ♦ To demonstrate the interaction of IWAS and IEM

  32. IEM
  • Interactive Electronic Media: electronic media in a ubiquitous computing environment that are not only controlled by a user's commands but also respond to context or the user's emotional state
  [diagram: wireless control of IEM]

  33. IEM
  • IEM examples:
  • IEM-encapsulated electronic appliances, such as a TV, a video player, a radio, or a computer
  • Responsive digital media: interactive media artworks
  • All objects with embedded computer chips or sensors: an automatic curtain that rises or falls according to a user's intention or preference; a lamp that intelligently controls the intensity of light according to a user's emotional state
  • IEM features (a device-interface sketch follows below):
  • Wireless control => ultimately, automation (agent system)
  • Unique ID
  • Interaction capability
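The three IEM features suggest a small device interface: a unique ID, a handler for explicit wireless commands, and a handler for context. The Python sketch below is an illustrative assumption, not the lab's actual protocol; SmartLamp and its dimming rule are invented examples.

from abc import ABC, abstractmethod


class IEM(ABC):
    """Interactive Electronic Medium: uniquely identified, wirelessly
    controllable, and able to react to context, not only to commands."""

    def __init__(self, uid: str):
        self.uid = uid  # the unique-ID feature

    @abstractmethod
    def handle_command(self, command: str) -> str:
        """Explicit wireless control."""

    @abstractmethod
    def on_context(self, context: dict) -> str:
        """Interaction capability: respond to context / user state."""


class SmartLamp(IEM):
    def handle_command(self, command: str) -> str:
        return f"lamp {self.uid}: executing '{command}'"

    def on_context(self, context: dict) -> str:
        # Made-up rule: soften the light when the user seems tired.
        if context.get("user_state") == "tired":
            return f"lamp {self.uid}: dimming to a warm, low level"
        return f"lamp {self.uid}: no change"


lamp = SmartLamp(uid="lamp-01")
print(lamp.handle_command("on"))            # explicit control
print(lamp.on_context({"user_state": "tired"}))  # implicit, context-driven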

  34. IWAS • Intelligent Wearable Assistance System

  35. IWAS H/W Design

  36. IWAS H/W Design
  • Self-contained wearable system: integrates all components of the wearable computer into a suit
  • User-friendly interface: input via speech recognition, keypad, mouse, etc.; output via a see-through HMD and small speakers
  • Various sensors: FSR and postural sensing unit; infrared tag reading unit
  • Wireless networking: wireless LAN, IEEE 802.11b

  37. Functions of IWAS
  • Intelligent user assistance:
  • Local identification using an IR sensor device
  • Direct control of IEM using an IR remote controller (an identify-then-control sketch follows below)
  • Communication via wireless LAN or Bluetooth
  • Information services such as schedule alerts and email checking
  • Interacting with media
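The identify-then-control loop on this slide can be sketched as a registry lookup keyed by the IR tag ID, followed by command dispatch. Everything concrete below (tag IDs, device names, command sets) is hypothetical.

# Hypothetical registry mapping IR tag IDs (read by the wearable's IR tag
# reader) to controllable IEM devices and the commands they accept.
DEVICE_REGISTRY = {
    "tag-0x3A": {"device": "TV", "commands": {"on", "off", "volume_up"}},
    "tag-0x7F": {"device": "lamp", "commands": {"on", "off", "dim"}},
}


def on_tag_seen(tag_id: str, user_command: str) -> str:
    """Local identification via IR tag, then direct control of the device."""
    entry = DEVICE_REGISTRY.get(tag_id)
    if entry is None:
        return f"unknown device tag {tag_id}"
    if user_command not in entry["commands"]:
        return f"{entry['device']} does not support '{user_command}'"
    # In the real system this would go out over the IR remote controller
    # or wireless LAN; here we just report the dispatch.
    return f"sending '{user_command}' to {entry['device']}"


print(on_tag_seen("tag-0x3A", "on"))   # user looks at the TV and says "on"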

  38. Functions of IWAS
  [diagram: UbiComp environment with a home network (HomeRF, IEEE 802.11, ···) linking a PC, audio player, phone, TV and lamp through a home gateway]
  • Control and sensing: IEM identification using an IR sensor; wireless control using an IR remote controller
  • Intelligent agent: provides personalized information services

  39. IWAS H/W Prototype [photos: IR tag reader & IR remote controller; see-through HMD with speech headset; FSR sensor; 3-axis postural sensor; IWAS suit]

  40. Interaction with IWAS and IEM
  • Case 1: Operating a laptop computer
  • Case 2: Turning on a TV

  41. Interaction with IWAS and IEM
  • Case 3: Controlling a virtual system
