
Mobile Context Inference Using Low-Cost Sensors


Presentation Transcript


  1. Mobile Context Inference Using Low-Cost Sensors Evan Welbourne, 591hk - 1/28/05

  2. Mobile Context Inference • “Mobile context”: a term that encompasses many types of sub-context • computing: connectivity, communication cost, nearby devices • user: user’s profile, activities, goals, social context • physical: location, sound, other people, traffic conditions • time: time of day, date, context histories • Current focus: a user’s significant places and mode of transit between them • computing: depends on the application, probably nearby APs • user: significance of places, goals during transit • physical: location, elements of surrounding place, mode of transit • time: patterns of residence in, and transit between places; low-level context histories for high-level context inference

  3. Related Work: Significant Places and Transit • Most past work uses GPS • Significant places: loss of signal or clustering over time - Marmasse et al. - signal lost and then regained within radius r - Ashbrook et al. - cluster signal-loss spots, cluster over time outside - Patterson et al. - knowledge, lost signal, cluster over time • Transit: speed, schedules, past patterns - Marmasse et al. - past patterns, speed (recently) - Ashbrook et al. - past patterns - Patterson et al. - past patterns, speed, bus schedules • Some work uses GSM cells (Laasonen et al.) • Significant places: cell clustering • Transit: past patterns of movement

  4. Related Work: Multi-Sensor Mobile Context Systems • Recent projects use multi-sensor platforms in phone form factor • TEA - 2-axis accel, light, temp, touch, IR sensors in a Nokia cell phone - Schmidt et al. (began at Starlab, now at TecO) • SenSay - 3-axis accel, light, temp, voice, ambient noise; phone, box, 2 mics - Siewiorek et al. • WatchMe - GPS, 3-axis accel, mic; iPaq, big watch, microphone - Marmasse et al. • Trend: cell phones with many sensors, large storage, and fast computation

  5. Technology: Place Lab • Low-cost, indoor-outdoor, privacy-observant location system • Database of (802.11 AP) → (lat, lon) and (GSM tower) → (lat, lon) mappings • Calculate your current (lat, lon) using your device’s current beacon readings
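To make the mapping concrete, here is a minimal sketch (not the actual Place Lab API) that estimates a position as the centroid of the mapped beacons currently in range; the beacon database and scan format below are hypothetical.

```python
# Minimal sketch of the Place Lab idea: given a mapping from beacon IDs to
# known coordinates, estimate the device's position as the centroid of the
# beacons it currently hears. Beacon IDs and coordinates are made up.

beacon_db = {
    "00:11:22:33:44:55": (47.6553, -122.3035),  # known 802.11 AP -> (lat, lon)
    "66:77:88:99:aa:bb": (47.6549, -122.3041),
    "cc:dd:ee:ff:00:11": (47.6558, -122.3029),  # could also be a GSM cell ID
}

def estimate_position(visible_beacons):
    """Estimate (lat, lon) as the centroid of the mapped beacons in range."""
    known = [beacon_db[b] for b in visible_beacons if b in beacon_db]
    if not known:
        return None  # no mapped beacons heard in this scan
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return (lat, lon)

# Example: a scan that hears two mapped APs
print(estimate_position(["00:11:22:33:44:55", "66:77:88:99:aa:bb"]))
```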

  6. Technology: Place Extraction with Place Lab • Similar to previous work, uses time-based clustering • Works indoors and outdoors • Use location traces to find significant places • Input: user’s day-to-day location traces • Output: areas where most time is spent • Use a time-based clustering algorithm (example dwell times: 10 minutes, 45 minutes, 5 hours, 3 hours, 7 hours)
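As a rough illustration of the time-based clustering step, here is a minimal sketch assuming a trace of timestamped (lat, lon) fixes; the 100 m radius and 10-minute dwell threshold are illustrative placeholders, not the system's actual parameters.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def extract_places(trace, radius_m=100.0, min_dwell_s=600.0):
    """Time-based clustering: a significant place is anywhere the trace stays
    within radius_m of a cluster's first fix for at least min_dwell_s seconds.
    trace is a list of (timestamp_s, (lat, lon)) tuples in time order."""
    places, cluster = [], []
    for t, pos in trace:
        if not cluster or haversine_m(cluster[0][1], pos) <= radius_m:
            cluster.append((t, pos))          # still dwelling near the same spot
        else:
            if cluster[-1][0] - cluster[0][0] >= min_dwell_s:
                places.append(cluster[0][1])  # dwell was long enough: a place
            cluster = [(t, pos)]              # start a new candidate cluster
    if cluster and cluster[-1][0] - cluster[0][0] >= min_dwell_s:
        places.append(cluster[0][1])
    return places
```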

  7. Technology: Multimodal Sensor Board (MSB) • 2” x 1” multimodal sensor platform • Capable of running independently and relaying its data to another device • All sensors grouped together as they would be on a small mobile device

  8. Our System • iPaq hx4705 + Nokia 6600 + MSB • iPaq and phone run Place Lab; the MSB streams data to the iPaq, which handles processing/storage • Low-cost sensors found in everyday devices like cell phones • Could eventually collapse into a cell phone

  9. Initial Experiments: Enhanced Place Extraction • Use other sensors to enhance our place extraction capabilities • “Zoom-in and fingerprint”: • accelerometers suggest moving / not moving • stationary for > k minutes, then “zoom in” and fingerprint the spot • treat this high-resolution fingerprint as a sub-place • Might be able to classify places based on characteristic sensor readings
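A minimal sketch of this zoom-in-and-fingerprint trigger, assuming accelerometer magnitudes grouped into one-minute windows; the stationarity threshold and the fingerprint contents are hypothetical.

```python
import statistics

def is_stationary(accel_magnitudes, std_threshold=0.05):
    """Low variance in accelerometer magnitude suggests 'not moving'."""
    return statistics.pstdev(accel_magnitudes) < std_threshold

def zoom_in_fingerprint(sensor_snapshot):
    """Placeholder for the 'zoom-in' step: record a high-resolution
    fingerprint of all sensors at the current spot (a sub-place)."""
    return dict(sensor_snapshot)

def maybe_fingerprint(minute_windows, sensor_snapshot, k=5):
    """If the last k one-minute accelerometer windows all look stationary,
    fingerprint the spot; otherwise do nothing."""
    recent = minute_windows[-k:]
    if len(recent) == k and all(is_stationary(w) for w in recent):
        return zoom_in_fingerprint(sensor_snapshot)
    return None
```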

  10. Initial Experiments: Mode of Transit Inference • Use other sensors to assist in mode of transit inference • 25 – 120 second Place Lab history can give us approximate speed • faster than human speeds → vehicle • moving at human speeds → look at accelerometers: running, walking • Other sensors (e.g. sound) can probably help with other distinctions
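A minimal sketch of this speed-then-accelerometer decision, reusing haversine_m from the place-extraction sketch above; the speed and accelerometer thresholds are illustrative placeholders, not measured values.

```python
def transit_mode(positions_window, window_s=60.0, accel_std=None,
                 vehicle_speed_mps=3.0, walk_run_std=0.3):
    """Rough mode-of-transit guess from a short Place Lab history.
    positions_window: list of (lat, lon) fixes spanning window_s seconds.
    accel_std: optional accelerometer-magnitude standard deviation."""
    dist = sum(haversine_m(a, b)
               for a, b in zip(positions_window, positions_window[1:]))
    speed = dist / window_s
    if speed > vehicle_speed_mps:      # faster than human speeds -> vehicle
        return "vehicle"
    if accel_std is not None:          # human speeds: look at accelerometer energy
        return "running" if accel_std > walk_run_std else "walking"
    return "walking"
```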

  11. MobileCAES • Main software artifact: Mobile context-aware experience sampling program • Gathers user-annotated sensor traces over a long period of time in daily life • iPaq allows checkbox questionnaires and audio notes • phone allows digital photos • Context-aware element allows us to focus on situations of interest • Architecture for supporting quick, simple context-triggers
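As an illustration of what a quick, simple context-trigger architecture could look like, here is a hypothetical sketch; the class, event, and field names are not MobileCAES's actual interfaces.

```python
# Hypothetical context-trigger registry of the kind an experience-sampling
# tool might use to fire questionnaires in situations of interest.

class TriggerRegistry:
    def __init__(self):
        self._triggers = []  # list of (predicate, action) pairs

    def register(self, predicate, action):
        """predicate: context dict -> bool; action: called when it fires."""
        self._triggers.append((predicate, action))

    def on_context_update(self, context):
        """Evaluate every registered trigger against the latest context."""
        for predicate, action in self._triggers:
            if predicate(context):
                action(context)

# Example: prompt a questionnaire when the user arrives at a new place.
registry = TriggerRegistry()
registry.register(
    lambda ctx: ctx.get("event") == "arrived_at_place",
    lambda ctx: print("Prompt questionnaire for place:", ctx["place"]),
)
registry.on_context_update({"event": "arrived_at_place", "place": "cafe"})
```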

  12. User Deployment for Data Collection • Next week: • deploying MobileCAES to 3 job coaches at UW CHDD • deploying sensor board to 3 Intel employees (probably) • I’ll use MobileCAES as well

  13. Future Directions: Applications • Studies of prospective memory • place/time-based memory: when and why do we remember? (Sellen et al.) • design and implementation of a prospective memory aid (Lamming et al.) • Interruptibility • related to memory • use activity level and place (Siewiorek et al.) • Infrastructure assist • Use mode of transit to choose a motion model for particle filters • Additional sensors could help with end-user mapping efforts

  14. Future Directions: System • Integrate with other systems at UW • Assist Opportunity Knocks • Augment RFID reminder system • Collapse system into fewer components
