
Recognising Situations in context aware systems using Dempster-Shafer Theory


Presentation Transcript


  1. Recognising Situations in Context Aware Systems using Dempster-Shafer Theory. Dr. Susan McKeever, Nov 4th 2013

  2. Context Aware systems – e.g. Smart home • Sensors in a smart home • Situation tracking – what is the user doing? What activity are they undertaking? • E.g. monitoring the elderly

  3. Context Aware systems • Pervasive / ubiquitous / ambient systems – embedded in the environment • E.g. intelligent homes, location tracking systems • They understand their own “context” • Context-awareness is the ability to track the state of the environment in order to identify situations • Situations are human-understandable representations of the environment, derived from sensor data

  4. Research focus: e.g. Gator Tech Smart Home

  5. Van Kasteren sensor-equipped smart home • 14 digital sensors, recorded for a month • 7 situations: preparing breakfast, preparing dinner, preparing a drink, leave house, use toilet, take shower, go to bed

  6. Abstracting sensor data to situations • A location sensor reading (X, Y, Z, ID239, 12:30:04) is abstracted to the context ‘John located in Kitchen @ time 12:30’, which is evidence of the situation ‘John is preparing meal’, which in turn is used by an application, e.g. an elderly alert system

  7. Situation Recognition • Sensor data, e.g. (12:53, 0), (2.15, 5.04, 3.16, 12:34), is mapped to the situation(s) occurring at time t, e.g. 12:53 preparing breakfast • Situation recognition is a critical, continuous, dynamic process – often required in real time • The recognition process is difficult and uncertain – no single approach is suitable for all • Knowledge: expert? Past data?

  8. Situation Recognition – Scenario. “The person is in the kitchen. It is morning time. They carry out a series of tasks, such as taking cereal out of the groceries cupboard, using the kettle, opening the fridge, and using the toaster” • Human observer: “preparing breakfast” • Why? • Individual tasks may not confirm that breakfast is in progress, but together they indicate the ‘preparing breakfast’ situation • Morning time • Informative sensors, e.g. toaster

  9. Recognising situations – Automated • Sensor overlap – kettle and fridge: ‘preparing drink’? • Gaps of seconds or minutes occurring with no sensor activity – classify? • As more tasks are done, the system is more certain of the ‘preparing breakfast’ situation – temporal aspect • Sensors can break down and have an error rate – toaster sensor doesn’t fire? • The person does not prepare breakfast in the same way every day; the tasks are not necessarily performed in any particular order • Co-occurring situations? (‘on telephone’); cannot co-occur (‘user asleep’)? – valid combinations of situations • Different people “prepare breakfast” in different ways – individual definitions? • A second occupant now enters the kitchen – how to distinguish?

  10. Recognising situations – Some approaches • Machine learning techniques, inc. • Bayesian networks • Decision trees • Hidden Markov models • – reliant on training data • Specification-based approaches, inc. • Logic approaches • Fuzzy logic • Temporal logic

  11. Problems to be solved (not exhaustive) • How to recognise situations in pervasive environments, allowing for particular challenges: • Uncertainty (sensor data, situation definitions, context fuzziness) • Difficulties in obtaining training data • My solution: Use and enhance evidence theory (Dempster Shafer theory)

  12. Why Dempster-Shafer theory • Devised in the 1970s • A mathematical theory for combining separate pieces of information (evidence) to calculate the belief in an event • Applied in military applications, cartography, image processing, expert systems, risk management, robotics and medical diagnosis • Key features: • its ability to specifically quantify and preserve uncertainty • its facility for assigning evidence to combinations of hypotheses • Various researchers are applying it in pervasive systems

  13. Approach • Apply Dempster-Shafer (evidence) theory to situation recognition • Create a network structure to propagate evidence from sensors • Extend the theory to allow for: • New operations needed to support evidence processing of situations • Temporal features of situations • Rich (static and dynamic) sensor quality

  14. Dempster-Shafer theory: Example • Two sensors are used to detect user location in an office. The locations of interest are: (1) the cafe, (2) the user’s desk, (3) the meeting room and (4) the lobby of the building • Frame of discernment: the set of ‘hypotheses’ (allows combinations) • Evidence sources: Sensor 1 and Sensor 2; each sensor assigns belief as a ‘mass function’, which totals 1 per sensor • Any uncertainty is assigned to the ‘ignorance’ hypothesis θ = {desk ∪ cafe ∪ meetingRoom ∪ lobby}

  15. Dempster-Shafer theory: Example • Sensor 1 detects the user’s location in the cafe. The sensor is 70% reliable, so its belief is assigned across the frame as {cafe 0.7; θ 0.3} • Sensor 2 has conflicting evidence, assigning the mass function {meetingRoom 0.2; desk ∪ cafe ∪ lobby 0.6; θ 0.2} • To combine evidence sources: use Dempster’s combination rule

  16. Dempster-Shafer theory: Combination rule • $m_{12}(A)$ is the combination of two evidence sources (mass functions) for a hypothesis $A$: $m_{12}(A) = \frac{\sum_{B \cap C = A} m_1(B)\, m_2(C)}{1 - K}$ • The denominator is a normalisation factor $1 - K$, where $K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C)$ is the conflicting evidence • Each evidence source must sum to 1: $\sum_{A \subseteq \Theta} m(A) = 1$

  17. Dempster-Shafer theory: example • Combining Sensor 1 {cafe 0.7; θ 0.3} with Sensor 2 {meetingRoom 0.2; desk ∪ cafe ∪ lobby 0.6; θ 0.2}: conflict K = 0.14; all evidence is normalised by 1 − K, giving: cafe 0.65; meetingRoom 0.07; desk ∪ cafe ∪ lobby 0.21; uncertainty (θ) 0.07
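
A minimal Python sketch (not from the original presentation) of the combination rule on slide 16, reproducing the numbers above; the dictionary representation, the set names and the `combine` helper are illustrative assumptions.

```python
from itertools import product

# Hypotheses are frozensets over the frame of discernment.
FRAME = frozenset({"cafe", "desk", "meetingRoom", "lobby"})

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset hypotheses to masses (each summing to 1).
    Returns the normalised combined mass function and the conflict K.
    """
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                      # mass falling on the empty set
    return {h: v / (1.0 - conflict) for h, v in combined.items()}, conflict

# Sensor 1: {cafe 0.7, theta 0.3}; Sensor 2: {meetingRoom 0.2, desk/cafe/lobby 0.6, theta 0.2}
m1 = {frozenset({"cafe"}): 0.7, FRAME: 0.3}
m2 = {frozenset({"meetingRoom"}): 0.2,
      frozenset({"desk", "cafe", "lobby"}): 0.6,
      FRAME: 0.2}

fused, k = combine(m1, m2)
print(f"K = {k:.2f}")                              # K = 0.14
for h, v in fused.items():
    print(sorted(h), round(v, 2))                  # cafe 0.65, meetingRoom 0.07,
                                                   # desk/cafe/lobby 0.21, theta 0.07
```

Running this reproduces the slide’s values: conflict K = 0.14 and, after normalising by 1 − K = 0.86, cafe 0.65, meetingRoom 0.07, desk ∪ cafe ∪ lobby 0.21, with 0.07 left on ignorance.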

  18. Dempster-Shafer theory: problems • Zadeh’s paradox: when two sources conflict strongly, normalisation can assign nearly all belief to a hypothesis that both sources considered unlikely, simply because it is their only point of agreement – not intuitive

  19. Dempster-Shafer theory: problems • Single sensor dominance: a single sensor can overrule a majority of agreeing sensors if it disagrees. E.g. if 5 sensors determine a user’s location in a house, a single “categorical” (certain) sensor that assigns all its belief to a contradictory option will negate the evidence from the remaining 4: four sensors report kitchen (0.9, 0.8, 0.7, 0.6) while one categorical sensor reports sitting room (1.0)

  20. Dempster-Shafer theory: gaps • No support for evidence spread over time: the theory assumes evidence is all co-occurring, but in reality evidence may be spread over time • E.g. detecting the “prepare dinner” situation from sensors on cupboards and the fridge: over a 40-minute timeline the fridge, freezer, groceries cupboard, pans cupboard and plates cupboard are each accessed at different points

  21. Dempster-Shafer theory: gaps • Only deals with fusing evidence: no “theory” for propagating evidence across other rules in order to recognise situations • Limited to just combining n “sources”: a set of additional mathematical operations is needed for propagating evidence from sensor readings (e.g. a location reading (X, Y, Z, ID239, 12:30:04)) up through abstracted context (‘John located in Kitchen @ time 12:30’) to situations (‘John is preparing meal’)

  22. Dempster-Shafer theory: gaps • Only deals with fusing evidence: no “theory” for propagating evidence across other rules in order to recognise situations (and a way to capture all this knowledge) • [Hierarchy diagram: sensor level → context values (abstracted context) → inference rules with certainty 0.n → situations]

  23. Recognising situations – Using Dempster-Shafer theory • Want an approach that reduces or eliminates reliance on training data – OK (provided we can define mass functions to say what sensor readings mean) • That allows for “uncertainty” – OK • That allows temporal information to be included – to be added • That allows sensor belief to be propagated (distributed) up into situation hierarchies based on “knowledge” rules – to be added • That addresses the issue of Zadeh’s paradox and dominant sensors – to be added • Ultimately: develop a full decision-making architecture for real-time situation recognition (overleaf) – to be added • Needed to extend Dempster-Shafer theory

  24. Develop a full decision-making architecture for real-time situation recognition using extended DS theory • [Architecture diagram: sensor readings → belief distribution (extended DS theory, knowledge, valid situation combinations) → decision stage → recognised situations at time t, e.g. ‘Prep Breakfast 0.3, Take a shower 0.6’ → applications]

  25. Knowledge: an interconnected hierarchy of sensors and situations • [Hierarchy diagram: sensor level → context values (abstracted context) → inference rules with certainty 0.n → situations]

  26. Van Kasteren example: 3 of the situations • [Situation DAG fragment: sensors (cup, plates cupboard, microwave, groceries cupboard, freezer, pans cupboard, fridge, time) provide context values (cup used, plates used, microwave used, groceries used, freezer used, pans used, fridge used, morning, nighttime) supporting the situations Get Drink <2>, Prepare Breakfast <15> and Prepare Dinner <62>, with rule certainties such as 0.2, 0.4 and 0.8]

  27. First: define a notation for knowledge capture, denoting sensor evidence / context / situations – the Situation DAG • [Notation diagram: sensors (with discount 0.n) contribute belief distributions to context values; inference rules with certainty 0.n connect context values to situations, annotated with durations such as <5> and >10>]

  28. First: define a notation for denoting sensor evidence / context / situations – the Situation DAG, i.e. to capture the knowledge of which sensors indicate which situation • Node: a sensor, context value or situation • Edges: ‘is a type of’, ‘is evidence of’ • <duration>: duration of situation, evidence not in sequence • >duration>: duration of situation, evidence in sequence • Discount 0.n: discount factor applied to a sensor, 0 < n < 1 • Certainty 0.n: certainty applied to an inference rule, 0 < n < 1

  29. Second: create evidence propagation rules to distribute/propagate belief up to situation level • Translate sensor readings into beliefs at the sensor level, then propagate up to situation certainties at the situation level • [Hierarchy diagram: sensor level → context values → situations]

  30. Second: create evidence propagation rules to distribute/propagate belief up to situation level • [Hierarchy diagram as above: sensor level → context values → situations]

  31. Second: create evidence propagation rules to distribute/propagate belief up to situation level: Examples • ‘Is a type of’: e.g. situation X is occurring if either situation Y OR Z is occurring – the occupant is “resting” if they are “watching TV” or “in bed” • Distributing combined belief across single situations
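
One standard way to read a parent (‘is a type of’) situation off a fused mass function is the Dempster-Shafer belief function, i.e. the total mass of all focal elements contained in the parent hypothesis. The sketch below only illustrates that reading; the situation names and masses are hypothetical, and the thesis may define its propagation operation differently.

```python
def belief(mass, hypothesis):
    """Standard Dempster-Shafer belief: total mass of all focal
    elements that are subsets of the hypothesis."""
    return sum(v for h, v in mass.items() if h <= hypothesis)

# 'Resting' is a parent of 'watching TV' and 'in bed' (hypothetical masses).
fused = {frozenset({"watchingTV"}): 0.5,
         frozenset({"inBed"}): 0.2,
         frozenset({"watchingTV", "inBed", "preparingMeal"}): 0.3}
print(belief(fused, frozenset({"watchingTV", "inBed"})))   # 0.7 belief in 'resting'
```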

  32. Second: create evidence propagation rules to distribute/propagate belief up to situation level: Examples – Sensor quality • Some sensors are inherently lower quality as an evidence source, e.g. a calendar sensor is indicative of the calendar owner’s real location only 70% of the time – discount (d) the evidence from that sensor
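
A sketch of classical Shafer discounting for such a sensor; the 70% reliability is the calendar figure from the slide, while the location names are illustrative and the thesis’s exact discount operation may differ.

```python
def discount(mass, reliability, frame):
    """Classical Shafer discounting: scale each focal element by the sensor's
    reliability and move the remaining mass to total ignorance (the frame)."""
    discounted = {h: reliability * v for h, v in mass.items() if h != frame}
    discounted[frame] = 1.0 - reliability + reliability * mass.get(frame, 0.0)
    return discounted

FRAME = frozenset({"office", "meetingRoom", "cafe", "lobby"})
calendar = {frozenset({"meetingRoom"}): 1.0}      # calendar says: in the meeting room
print(discount(calendar, 0.7, FRAME))             # meetingRoom 0.7, ignorance 0.3
```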

  33. Third: include temporal evidence • Over a 40-minute “Prepare Dinner” timeline, different sensors (fridge, groceries cupboard, plates cupboard, freezer) fire intermittently – no single sensor is sufficient for situation recognition • (1) Use absolute time as evidence • (2) Find a way to combine transitory evidence

  34. Third: extend evidence for the duration of the situation • [Timeline diagram: between ‘Prepare Breakfast ends’ and across the ‘Prepare Dinner’ situation duration, each sensor firing (fridge, groceries cupboard, plates cupboard, freezer, pans cupboard accessed) is time-extended so that its evidence persists over subsequent timeslices]

  35. Fusing time-extended evidence • Adjust the Dempster-Shafer fusion rules to allow for time extension of evidence: two transitory extended mass functions for a hypothesis h with duration t_dur, at time t + t_rem
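
The slide does not give the extended fusion rule itself, so the following is only a hedged illustration of the idea of time-extending evidence: a sensor firing keeps contributing a (linearly decayed) mass over the situation’s expected duration, with the remainder pushed to ignorance. The decay scheme, names and numbers are assumptions, not the thesis’s definition.

```python
def time_extend(mass, frame, firing_time, now, t_dur):
    """Illustrative time-extended mass function: evidence observed at
    firing_time decays linearly over the situation duration t_dur,
    with the lost mass reassigned to ignorance (the frame)."""
    age = now - firing_time
    weight = max(0.0, 1.0 - age / t_dur)
    extended = {h: weight * v for h, v in mass.items() if h != frame}
    extended[frame] = 1.0 - sum(extended.values())
    return extended

FRAME = frozenset({"prepareDinner", "other"})
fridge = {frozenset({"prepareDinner"}): 0.8, FRAME: 0.2}
# Fridge fired at t = 0; 10 minutes into a 40-minute situation the evidence is mostly intact.
print(time_extend(fridge, FRAME, firing_time=0, now=10, t_dur=40))
# prepareDinner 0.6, ignorance 0.4
```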

  36. Fourth: allow for Zadeh’s paradox and single-sensor dominance. Two options: • Use an alternative combination rule (Murphy’s), which averages out the evidence BEFORE fusing – removes Zadeh’s problem • Use a simpler averaging rule to fuse evidence – lacks convergence
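
A hedged sketch of the first option, Murphy’s average-then-combine rule, applied to the five-sensor scenario from slide 19; the two-element frame and the re-implementation of Dempster’s rule are illustrative assumptions, not the thesis’s code.

```python
from functools import reduce
from itertools import product

def dempster(m1, m2):
    """Plain Dempster combination (as on slide 16)."""
    out, k = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + x * y
        else:
            k += x * y
    return {h: v / (1.0 - k) for h, v in out.items()}

def murphy(masses):
    """Murphy's rule: average the n mass functions first, then apply
    Dempster's rule to n copies of the average (n - 1 combinations)."""
    n = len(masses)
    avg = {}
    for m in masses:
        for h, v in m.items():
            avg[h] = avg.get(h, 0.0) + v / n
    return reduce(dempster, [avg] * n)

# Slide 19 scenario (two-element frame assumed for brevity): four sensors
# mostly support 'kitchen', one categorical sensor insists on 'sittingRoom'.
FRAME = frozenset({"kitchen", "sittingRoom"})
sensors = [
    {frozenset({"kitchen"}): 0.9, FRAME: 0.1},
    {frozenset({"kitchen"}): 0.6, FRAME: 0.4},
    {frozenset({"kitchen"}): 0.8, FRAME: 0.2},
    {frozenset({"kitchen"}): 0.7, FRAME: 0.3},
    {frozenset({"sittingRoom"}): 1.0},            # the dominant, categorical sensor
]
fused = murphy(sensors)
print({tuple(sorted(h)): round(v, 3) for h, v in fused.items()})
```

With plain Dempster combination the categorical sensor would force all belief onto ‘sittingRoom’; averaging first lets the four agreeing sensors keep most of the belief on ‘kitchen’.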

  37. Fifth: combine all this and apply it to real-world data for situation recognition • Test our approach using annotated datasets of sensor readings • [Architecture diagram as before: sensor readings → belief distribution (extended DS theory, knowledge, valid situation combinations) → decision stage → recognised situations at time t, e.g. ‘Prep Breakfast 0.3, Take a shower 0.6’ → applications]

  38. Experiments • Data set (1): “Van Kasteren” • Heavily used by other researchers – allows comparison of situation recognition results • 7 situations annotated, 14 sensors • Data set (2): “CASL” • Office data set: 3 situations annotated • Location sensors, calendar sensor, keyboard sensor

  39. Evaluation • Various sub-questions also addressed: comparison with published results, comparison of DS fusion rules, impact of quality on situation transitions, quality parameter sensitivity, static versus dynamic quality

  40. Evaluation • 2 annotated published real-world datasets – Van Kasteren (smart home) and CASL (office-based) • Situation DAGs created for both datasets • Situation recognition accuracy measured using the F-measure over timesliced datasets • Recognition accuracy using the temporal and quality extensions evaluated • J48 decision tree and Naive Bayes used for comparison, along with published results; cross-validation used
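
Purely as an illustration of the per-situation F-measure over timesliced predictions (scikit-learn assumed; the labels below are made up, not drawn from either dataset):

```python
from sklearn.metrics import f1_score

# Hypothetical timesliced annotations: one true and one predicted situation per timeslice.
y_true = ["prepareBreakfast", "prepareBreakfast", "useToilet", "goToBed", "goToBed"]
y_pred = ["prepareBreakfast", "useToilet",        "useToilet", "goToBed", "goToBed"]

situations = ["prepareBreakfast", "useToilet", "goToBed"]
per_situation_f = f1_score(y_true, y_pred, labels=situations, average=None)
print(dict(zip(situations, per_situation_f.round(2))))
# {'prepareBreakfast': 0.67, 'useToilet': 0.67, 'goToBed': 1.0}
```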

  41. Use of DS theory with temporal extensions for situation recognition • [Chart: F-measure for each situation using DS theory – (1) no time, (2) absolute time, (3) time extended (Van Kasteren dataset)]

  42. Temporal DS theory compared to two other approaches: Naive Bayes and J48 decision tree • [Chart: comparison per situation]

  43. Our approach compared to the three available published results (same experimental measures) • * Excludes ‘inactive’ timeslices with no sensors firing, which are harder to infer

  44. Use of DS theory with temporal extensions • Use of temporal extensions significantly improves situation accuracy (over baseline DS theory alone) • Performs better than J48 and Naive Bayes (particularly with limited training data); this improvement narrows when more training data is used (LODO) • Achieves 69% class accuracy in comparison to Van Kasteren (49.2%) and Ye* (88.3%)

  45. Use of DS theory with quality extensions • [Chart: F-measure for each situation using DS theory – with and without quality]

  46. Use of DS theory with quality extensions • Use of quality parameters significantly improves situation recognition accuracy (over baseline) • Performance close to Naive Bayes (within 4%) and J48 (within 2%) • Each individual sensor’s quality contributes to the improvement • Sensitivity analysis of the quality parameters indicates the relative quality of sensors may be important • Time-based dynamic quality parameters impact situation transitions – application dependent

  47. Conclusions • Our DS theory approach is viable for situation recognition: • Not reliant on training data • Incorporates domain knowledge • Caters for uncertainty • Encoding temporal and quality knowledge improves performance over the basic DS approach, BUT: • Knowledge must be available • Different fusion rules are appropriate in different scenarios – requires expert “evidence theory” knowledge • The environment changes – no feedback loop for drift • Potentially high computation effort, which can be reduced

  48. Contributions • A situation recognition approach based on DS theory • Selection of existing and creation of new evidential operations and algorithms to create evidence decision networks • Temporal and quality extensions to DS theory • A diagramming technique to capture the structure of evidence for an environment (the Situation DAG) • A thorough application, evaluation and analysis of the extended DS theory approach • An analysis of alternative fusion rules

  49. Related Publications • Journals • Journal of Pervasive and Mobile Computing • JAISE Volume 2, Number 2, 2010 • International conferences • EuroSSC Smart Sensing, UK, 2009 • ICITST Pervasive Services, Italy, 2008 • International (peer-reviewed) workshops • Pervasive 2010, Helsinki, Finland • CHI 2009, Boston, US • QualConn 2009, Stuttgart, Germany • Pervasive 2009, Sydney, Australia

  50. Questions?
