
Location and Context Awareness

Dan Siewiorek

June 2012


Outline

  • Distraction

  • Context Aware Computing

  • Location

  • Activity Recognition

  • Applications

  • Research Challenges



Moore’s Law Reigns Supreme

Chart: transistors per processor versus year introduced (1972-1996), from the 4004, 8080, and 6800-series parts through the 80486, Pentium, and Pentium Pro; the slope shows a 10x increase in 7 years. (Source: Walt Davis, Motorola)


Moore’s Law Reigns Supreme

Disk Capacity


Moore’s Law Reigns Supreme

Cost per Megabyte


Glaring Exception

Chart: human attention, essentially unchanged from Adam & Eve to 2000 AD.



Outline

  • Distraction

  • Context Aware Computing

  • Location

  • Activity Recognition

  • Applications

  • Research Challenges


Context Aware Computing

  • Applications that use context to provide task-relevant information and/or services

  • Context is any information that can be used to characterize the situation of an entity (person, place, or physical or computational object)

  • Contextual sensing, adaptation, resource discovery, and augmentation
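
To make the definition above concrete, context can be modeled as a small record of typed attributes attached to an entity. The sketch below is illustrative only; the field names are assumptions, not a schema from the presentation.

```python
# A context record as a small set of typed attributes; fields are illustrative.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Context:
    entity: str                       # person, place, or object being characterized
    location: Optional[str] = None    # symbolic location, e.g. a room name
    activity: Optional[str] = None    # e.g. "walking", "in a meeting"
    heart_rate: Optional[int] = None  # beats per minute, if a sensor supplies it
    observed_at: datetime = field(default_factory=datetime.now)

# An application can query such records to decide what is task-relevant now.
bob = Context(entity="Bob", location="Conference Room A", activity="in a meeting")
print(bob.location, bob.activity)
```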


Context Aware Service Examples

  • Primary Services

    • Location

    • Biological measures (heart rate, breathing)

    • Position of limbs

  • Derived Services

    • Difficulty in performing activity

    • Amount of activity for elderly

    • “Is Bob coming to the meeting?”

    • Match Making (location, activity, skill level)


Outline

  • Distraction

  • Context Aware Computing

  • Location

  • Activity Recognition

  • Applications

  • Research Challenges


Location Sensing Parameters

  • Physical (x,y,z) versus Symbolic (room)

  • Absolute (shared reference grid) vs Relative

  • Localized Location Computation - where the location calculation is performed

  • Object Recognition

  • Accuracy, Precision

    • Distance, Distribution

  • Scale

    • Number of objects per unit infrastructure per time interval

  • Cost




Location Sensing Approaches

  • Triangulation

    • Lateration – multiple distance measurements between known points (a sketch follows this list)

    • Angulation – angle or bearing relative to points with known separation

  • Proximity

    • Nearness to known set of points

  • Scene Analysis

    • Identify relationship to known points
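
As referenced in the Lateration bullet, here is a minimal two-dimensional lateration sketch. The anchor coordinates and distances are made up for illustration, and a least-squares solve stands in for whatever estimator a real system would use.

```python
# A 2-D lateration sketch (illustrative anchors and distances): estimate a
# position from distance measurements to known points by subtracting one range
# equation from the others and solving the resulting linear system.
import numpy as np

def laterate(anchors, distances):
    """Estimate (x, y) from distances to known anchor positions."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # (x - xi)^2 + (y - yi)^2 = di^2; subtracting the first equation from the
    # rest removes the quadratic terms, leaving a linear system A p = b.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Three anchors and noise-free distances to the point (5, 5).
print(laterate([(0, 0), (10, 0), (0, 10)], [7.07, 7.07, 7.07]))
```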



Location Service Architecture Comparison

  • Four main types of location architectures (Centralized Push, Centralized Pull, Distributed Push, Distributed Pull)

  • Location aware messaging as a representative application

  • Quantified data flow requirements and messages for location based application

  • Centralized pull model performs better than distributed for location aware messaging



Location Service Architecture Data Rates

Chart: comparison of architectures using an instant messaging application; polling/push frequency = 1 minute, ~3,000 wireless clients, 48 buddies per user.


Systems Issues (Bats)

  • Aesthetics

  • Distribution of sensors by space usage

  • Physical/symbolic boundaries

    • Overlap

  • False negatives

    • Not wearing

  • Quiet Zone

    • Not tracked

  • Cycle: user participation decreases → application degrades → reduced incentive to participate


Outline

  • Distraction

  • Context Aware Computing

  • Location

  • Activity Recognition

  • Applications

  • Research Challenges



Model Generation Variables

  • Window Sizes

    • 4, 6 Seconds

  • Features Extracted

  • Model

    • k nearest neighbors

    • clustering

    • Support Vector Machine (SVM)

  • ‘Leave-one-out’ cross validation for optimization and testing
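
A minimal sketch of the model-generation steps listed above (windowing, feature extraction, a k-nearest-neighbor model, and leave-one-out cross validation). The sampling rate, feature set, and synthetic data are assumptions, not the actual experimental configuration.

```python
# A sketch of the pipeline above: fixed-length windows over a 3-axis signal,
# per-axis mean/std features, a k-NN model, leave-one-out cross validation.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

FS = 10                  # samples per second (assumed)
WINDOW_SECONDS = 4       # one of the window sizes listed above

def extract_features(window):
    """Per-axis mean and standard deviation for one window of samples."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def windows_to_features(stream, labels):
    size = FS * WINDOW_SECONDS
    feats, ys = [], []
    for start in range(0, len(stream) - size + 1, size):
        feats.append(extract_features(stream[start:start + size]))
        ys.append(labels[start + size - 1])      # label at the window's end
    return np.array(feats), np.array(ys)

# Synthetic data standing in for real sensor recordings.
stream = np.random.randn(1200, 3)
labels = np.repeat(["sitting", "walking"], 600)
X, y = windows_to_features(stream, labels)
scores = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, y, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean())
```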







New Sensor Types Increase Range of Activity Recognition

Chart: recognition accuracy with physiological sensors.


Cell Phone Activity Recognition Deployment


Data Quality

  • In a real-time wireless environment, data-centric systems are plagued by lost packets and corrupt data.

  • A dynamic sensor architecture can mitigate these problems.


Handling Multiple Sensors

Diagram: three architectures for handling multiple sensors – Centralized, Low Bandwidth, and Dynamic. In each, sensor devices supply raw sensor data and a master device issues the decision; the classifier, feature extraction, and the aggregation of the N sensors are partitioned differently between the two kinds of device.


Centralized Architecture

  • Classifier is static.

    • Requires knowledge of the sensor devices available (N).

  • Bandwidth utilization will be higher (~500 B/s per sensor) sending all raw data.

  • Raw Data Aggregator must deal with memory-intensive data sets (~1-10 MB).

Diagram: each sensor device streams raw sensor data over a wireless link (~500 B/s) to a Raw Data Aggregator on the master device; Feature Extraction reduces ~1-10 MB per window to ~1 KB per window, and the Classifier emits a ~4 B/window decision.


Coping with Data Loss in the Centralized Architecture

  • Process 1:

    • Buffer data on master device with the Raw Data aggregator.

    • Forward incomplete windows at least 2/3 full to Feature Extraction.

    • Ex: Forward a set of 40 data points when 60 are expected.

Diagram: the same data flow as the centralized architecture – raw sensor data from sensor devices 1..N over wireless links (~500 B/s each) to the Raw Data Aggregator, ~1-10 MB per window into Feature Extraction, ~1 KB per window into the Classifier, and a ~4 B/window decision.
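
A small sketch of Process 1's forwarding rule, assuming 60 expected samples per window as in the example above; the function boundaries are assumptions.

```python
# Sketch of the 2/3-full forwarding rule from Process 1.
EXPECTED_SAMPLES = 60        # samples expected per window
MIN_FRACTION = 2 / 3

def maybe_forward(window_samples, feature_extraction):
    """Forward an incomplete window only if it is at least 2/3 full."""
    if len(window_samples) >= MIN_FRACTION * EXPECTED_SAMPLES:
        feature_extraction(window_samples)       # e.g. 40 of 60 points is enough
        return True
    return False                                 # otherwise drop the window

print(maybe_forward(list(range(40)), lambda w: None))   # True
print(maybe_forward(list(range(30)), lambda w: None))   # False
```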


Coping with Data Loss in the Traditional Architecture

  • Process 2: Classify only on windows in which all sensors have an output.

    • Example: with subject 20, a maximum of 604 windows could be classified.


Low Bandwidth Architecture

  • Lower wireless bandwidth compared to centralized architecture

Diagram: Feature Extraction runs on each sensor device (~500 B/s of raw sensor data reduced to ~200 B/window of features); a Feature Aggregator on the master device collects the features over the wireless links and passes ~1 KB/window to the Classifier, which emits a ~4 B/window decision.


Dynamic Architecture

  • Fuse locally made classifications from multiple sensors.

  • N is dynamic.

  • Confidence information, the probability of each context, is transmitted to the Fuser (~80 B/window).

Diagram: each sensor device performs Feature Extraction and classification locally (~500 B/s of raw sensor data, ~200 B/window of features, ~80 B/window of confidence values); a Sensor Fuser on the master device combines the sensors' outputs over the wireless links into a ~4 B/window decision.
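
A minimal sketch of the fusion step, assuming each sensor reports a probability vector over the same activity set; simple averaging is only one of several possible fusion techniques, and the activity labels are illustrative.

```python
# Sketch of fusing per-sensor confidence vectors (assumed to be probability
# distributions over the same activities); averaging is one simple technique.
import numpy as np

ACTIVITIES = ["sitting", "standing", "walking", "running"]   # illustrative

def fuse(confidences):
    """Average whatever sensors reported this window and pick the winner."""
    if not confidences:                  # every sensor's packet was lost
        return None
    mean = np.mean(np.asarray(confidences, dtype=float), axis=0)
    return ACTIVITIES[int(np.argmax(mean))]

# Two sensors reported; a third packet was lost and is simply omitted.
print(fuse([[0.1, 0.2, 0.6, 0.1], [0.05, 0.15, 0.7, 0.1]]))   # -> walking
```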


Comparison to Centralized Architecture

  • Utilizes dynamic sensor set

    • Increased Accuracy

    • No static classifier in a central location.

  • Utilizes heterogeneous algorithms

    • Best techniques can be used on a per device basis to address:

      • Power constraints

      • Computation constraints


ErgoBuddy – Experimental Setup

  • 11 Subjects

    • Approximately 22 hours of data total.

  • 7 Sensor Body Locations

    • Ankle, Arm, Back, Handheld, Holster, Lanyard, Wrist

  • 10 Activities

    • Sitting, Standing, Lifting, Walking, Running, Carrying, Sweeping/Mopping, Stairs, Laddering, Carting


Experimental Issues

  • 7 Wearable Sensors for activity recognition communicating over Bluetooth.

  • Approximately 10% packet loss per sensor with current implementation.

    • Low Bandwidth Architecture Reliability ~48%

      • 1 packet lost = missed classification

      • Probability that all seven sensors deliver their packet @ 90% per-sensor reliability:

      • 0.9 ^ 7 ≈ 0.48

    • Dynamic Architecture Reliability ~99.99%

      • 1 packet lost = continue with the N-1 other sensors.

      • Probability that at least one of the seven sensors delivers its packet:

      • 1 - (0.1 ^ 7) ≈ 0.9999999
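
The same arithmetic written as a short script (90% per-sensor packet delivery, seven sensors):

```python
# Reliability arithmetic from the bullets above.
p_delivered = 0.9        # per-sensor packet delivery probability
n_sensors = 7

# Low bandwidth architecture: every sensor's features must arrive.
low_bandwidth = p_delivered ** n_sensors                 # ~0.48

# Dynamic architecture: at least one sensor's classification must arrive.
dynamic = 1 - (1 - p_delivered) ** n_sensors             # ~0.9999999

print(f"low bandwidth: {low_bandwidth:.1%}, dynamic: {dynamic:.5%}")
```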


Model Generation

  • 2 Final Window Sizes

    • 4, 6 Seconds

  • 7 Models

    • 1 model for each body location

  • 5 Fusion techniques

  • 0%-90% simulated packet loss environments

  • ‘Leave-one-out’ cross validation for optimization and testing


Fuser Technique – Lossless Environment

Diagram: the dynamic architecture (per-sensor Feature Extraction and Classifier, Sensor Fuser on the master device) operating with no packet loss.


Performance Results

  • With the same number of sensors in a lossless environment, fusion yields results 2% worse than a model with access to all sensors’ raw data.


Conclusions

  • In all lossy environments (10%+ packet loss), fusion performed better.

    • 35% accuracy increase in an environment with 50% packet loss.

  • The number of sensors can be reduced in low loss environments for power and bandwidth savings.

    • For our experiment, 3 sensors were ideal.

  • This technique can also be applied to systems of heterogeneous sensors.


Outline

  • Distraction

  • Context Aware Computing

  • Location

  • Activity Recognition

  • Applications

  • Research Challenges


Context Sensing

  • Basic context

    • Location

    • Orientation

    • Audio samples from the user’s environment

    • Static data

    • History of user context

  • Multiple sensors can be used to infer user’s intent

    • Wireless Network Card, Digital Compass, Thermometer, Camera


Example Applications

  • Notification

    • Alert a user if they are passing within a certain distance of a task on their to-do list.

    • SenSay Context Aware Cell Phone

  • Meeting Reminder

    • Alerts a user if they are in danger of missing a meeting.

  • Activity Recommendation

    • Recommends possible activities/meetings that a user might like to attend based on their interests.

  • Proactive Assistant

    • Answering questions about user’s intent

    • Proactively preparing user’s workspace based on usage patterns and behavior

  • Matchmaking

    • Locating an entity based upon expertise, skills, proximity and/or availability
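
A sketch of the first example, a to-do proximity alert. The planar coordinates, 50 m radius, and task list are illustrative assumptions, not details of SenSay or the deployed system.

```python
# Sketch of a to-do proximity alert: flag tasks whose location lies within a
# radius of the user's current position (planar coordinates for simplicity).
import math

def distance_m(a, b):
    """Planar distance in metres between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def proximity_alerts(position, todo_items, radius_m=50.0):
    """Return the names of tasks within radius_m of the current position."""
    return [name for name, task_pos in todo_items
            if distance_m(position, task_pos) <= radius_m]

todos = [("return library book", (120.0, 40.0)), ("buy printer paper", (900.0, 10.0))]
print(proximity_alerts((100.0, 30.0), todos))    # -> ['return library book']
```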



Technology

  • Location Service

    • Use multiple sources to calculate location (e.g., wireless access point triangulation, ceiling photo match)

    • Give applications a simple form

  • Transcoders

    • Translate data to a form useful on the device

    • Manage network disconnects

  • Persistent Proxies for Devices, Users

    • Allow policies to be set

    • Remove burden from individual applications


Handy Andy Architecture

Diagram: services (Waldo, PhD, Idealink, Stalker) and login/logout sit on a database and service infrastructure; a user proxy and device proxies (with speech encode/decode) connect the infrastructure to devices such as the Itsy, the Jornada, and other devices.


PhD Features and Interaction

  • User’s List:

    • Items can be added, moved, and removed

    • Only “checked” items appear on the map

  • Description:

    • Information on the currently selected item

    • Dynamic information automatically updated

  • Map:

    • Dynamic information automatically updated

    • Map Controls: Zoom & Pan


PhD (Personal Help Desk)




Virtusphere (http://www.virtusphere.com)

  • 10-foot hollow sphere that rotates freely in any direction according to the user’s steps.

  • A wireless, head-mounted display allows the user to walk and run while immersed in the virtual environment.

  • User movement replicated in the virtual environment.


System Help

  • CMU SCS Computing Facilities DB

  • Matchmaking: Expert to Problem

  • Facilities people have certain expertise

  • Users report problems

  • Performs Matchmaking

  • Assigns expert to the problem

  • Gets reply/confirmation from expert


MatchMaking (MM)

  • Obtain a list of most relevant experts for this problem

  • Find out which of these experts are available and if available, after how much time (mean and variance)

  • Find time to reach location of the problem

  • Using Rules, choose the best expert

    • time (mean, var), expert busy?, expert’s score
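
A sketch of the rule-based choice, assuming a simple weighted score over the factors listed above; the expert records, weights, and busy penalty are invented for illustration.

```python
# Sketch of choosing the best expert with a weighted score.
from dataclasses import dataclass

@dataclass
class Expert:
    name: str
    relevance: float       # skill match for this problem, 0..1
    mean_wait_min: float   # expected minutes until available
    wait_var: float        # variance of that estimate
    travel_min: float      # minutes to reach the problem location
    busy: bool

def score(e: Expert) -> float:
    penalty = e.mean_wait_min + e.wait_var ** 0.5 + e.travel_min
    if e.busy:
        penalty += 30.0    # assumed fixed cost of interrupting current work
    return e.relevance * 100 - penalty

def choose(experts):
    return max(experts, key=score)

candidates = [Expert("alice", 0.90, 5, 4, 10, False),
              Expert("bob", 0.95, 40, 25, 2, True)]
print(choose(candidates).name)    # -> alice
```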


Contexts Used

  • Location of the expert

  • Location of the problem

  • Expert profiles - skills

  • Expert availability, current involvement (busy)

  • Time spent on the problem so far

  • History of maintenance (problems → experts)

  • Other simultaneous problems

  • Time of the day


Major Components


Locator@CMU

  • Implemented on Wireless Andrew

    • ~1,000 APs

    • ~5,000 peak concurrent users

  • Centralized-Pull architecture using relational database

  • Provides omniscient view of network usage



Location-Based Applications

  • Where are users now and where have they been (past/present)

    • Contact/Spread Tracking

  • Where were users (past)

    • Unknowing Bystander Service

  • Where will users be (future)

    • Crowd Predictor


Applications: Contact/Spread Tracking

  • Scenario: someone is ‘infected’; how many people do they spread the disease to in various situations?

  • Divided wireless users into infecting agents and general users.

  • Selected 10 infecting agents (4 undergrads, 2 grads, 2 faculty, and 2 staff)
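
A sketch of counting direct (primary) contacts from access-point association logs. The session records and their fields are assumptions standing in for the actual Wireless Andrew data.

```python
# Sketch of direct (primary) contact counting: two users are in contact if
# they were associated with the same access point during overlapping times.
# Records are (user, access_point, start_minute, end_minute) and are synthetic.
sessions = [
    ("agent1", "AP-17", 0, 50),
    ("u42",    "AP-17", 30, 60),   # overlaps agent1 at AP-17 -> contact
    ("u99",    "AP-17", 55, 90),   # same AP but no time overlap
    ("u7",     "AP-03", 10, 20),   # different AP
]

def contacts_of(agent, sessions):
    """Users who shared an access point with the agent at an overlapping time."""
    found = set()
    for user_a, ap_a, a0, a1 in sessions:
        if user_a != agent:
            continue
        for user_b, ap_b, b0, b1 in sessions:
            if user_b != agent and ap_b == ap_a and a0 < b1 and b0 < a1:
                found.add(user_b)
    return found

print(contacts_of("agent1", sessions))    # -> {'u42'}
```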


Contact/Spread Tracking: Direct-Primary


Applications: Unknowing Bystander

  • Scenario: an event happens at location X. People nearby might not even be aware of it but could have valuable information.

  • Event Examples:

    • Crime (Burglary/Theft/Murder, etc.)

    • Lost item, pet, or person

  • Possible Users

    • Police, Homeland Security (Citizen Watch Corps)

    • Individuals

  • Used campus crime data to determine how many network users were near the area and could be potential witnesses

  • What percent of the time would there be a potential witness?


Unknowing Bystander Results

Chart: likelihood of a wireless user being a witness

  • 15 of the 16 crimes had potential witnesses

  • Average value of 12.8 for potential witnesses

  • Median value of 4.5 for potential witnesses

  • Chance of at least one witness among 4.5 potential witnesses when each has a likelihood of witnessing of 5% (21%), 10% (38%), or 33% (83%)
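
The arithmetic behind that last bullet, as a short script:

```python
# "At least one witness" arithmetic: with a median of 4.5 potential witnesses,
# vary the per-person likelihood of actually witnessing the event.
n_potential = 4.5
for p in (0.05, 0.10, 0.33):
    at_least_one = 1 - (1 - p) ** n_potential
    print(f"likelihood {p:.0%}: chance of >= 1 witness = {at_least_one:.0%}")
# prints roughly 21%, 38%, 83%
```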


Applications: Crowd Predictor

  • Use information from historical data to populate an application to predict future crowds at a location (Neural Network)

  • Can be used by organizations to find the best spot to set up a table

  • Allow for other limited criteria (such as type of space, time of day, day of week)
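
A minimal sketch of the crowd-predictor idea: train a small neural network on hour of day, day of week, and area type, then query a future time slot. The features and the synthetic data are assumptions standing in for the historical access-point logs.

```python
# Sketch of a crowd predictor: learn crowd counts at an access point from hour
# of day, day of week, and area type, then query a future slot. Synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor

AREA_TYPES = ["classroom", "office", "public", "dorm"]

rng = np.random.default_rng(0)
hours = rng.integers(0, 24, 500)
days = rng.integers(0, 7, 500)                     # 0 = Monday
areas = rng.integers(0, len(AREA_TYPES), 500)
# Synthetic "historical" counts: weekday midday peaks plus noise.
counts = 20 * np.exp(-((hours - 13) / 4.0) ** 2) * (days < 5) + rng.poisson(2, 500)

onehot = np.eye(len(AREA_TYPES))[areas]            # one-hot encode the area type
X = np.column_stack([hours, days, onehot])
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
model.fit(X, counts)

# Predicted crowd in a classroom area on Tuesday at 1 pm.
query = np.column_stack([[13], [1], np.eye(len(AREA_TYPES))[[0]]])
print(model.predict(query))
```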


Applications: Crowd Predictor - Metrics

  • Select 12 different Access Points

    • 3 in Classroom Areas

    • 3 in Office Areas

    • 3 in Public Areas

    • 3 in Dorm Areas

  • Predict wireless crowds at the 12 test Access Points at 5 different day/time combinations and compare to observed results

  • Look at the effect of time of day, day, and type of area on prediction


Applications: Crowd Predictor


Outline

  • Distraction

  • Context Aware Computing

  • Location

  • Activity Recognition

  • Applications

  • Research Challenges


Context Aware Computing Research Challenges

  • When and How to Interrupt

  • Privacy of Data

    • What information is collected

    • Who can access information

    • How long information is stored

  • How User Specifies Preferences on Data Availability

  • User Attention

    • Charge Market Value to those demanding attention

    • Combine theories from social science, cognitive science, and economics

