
Sentient recipes



Presentation Transcript


  1. Sentient recipes. First Year Talk by Simon Fothergill, DTG, Computer Laboratory, University of Cambridge. February 2006

  2. Presentation Content • Intent of Ph.D. • What I have done • The area I have carved out so far • Ideas for how to continue • What I am working on at the moment • Tie up • Comments, suggestions, criticisms…

  3. Summary of PhD: Inferring stuff! • Sentient Computing, Context awareness, Sensor fusion • Signal to symbol translation (stepwise, logical and statistical disambiguation) • Extending the Sentient vocabulary • Trying a number of different domains • Location (x,y,z) • Sentient Lecture Theatre (x,y,z, sound, video)

  4. What I have been doing • Experimented with the Bat system: • Python programming • Filtering, logging and visualisation of my movements for ~3 weeks • Bat poster • Bat buttons • Analysis of GPS data (inaccessible, not ready to commit!) • Lecture theatre (abandoned research proposal! - lighting, other ideas, microphone installation) • Broadband phones (too complex) • Background to Signal to Symbol translation taken from other fields

  5. Vision • Recipe analogy (mapping for how to create ways of getting information about different phenomena): • Want machine understanding of a phenomenon (dish) for better interaction • You need these sensors (ingredients) • Analyse data using these algorithms (combine ingredients according to these instructions) • Stepwise procedure with subparts • Infer results (produce dish) • Plug and play: • Plug any sensor into the local sensor infrastructure • Possibly need a drivers/configuration/calibration phase (possibly long-term training) • Possibly specify constraints • Get some “high level” information on what it senses
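The stepwise recipe idea on this slide can be sketched as a small pipeline: raw sensor samples are filtered, turned into displacements, then mapped onto symbols. All function names, thresholds and the sample data below are invented for illustration; this is a minimal sketch of the idea, not code from the Bat system.

```python
# A minimal sketch of the "recipe" pipeline: each stage refines raw
# sensor readings into higher-level symbols. Names and thresholds are
# illustrative only.

from statistics import mean

def smooth(samples, window=3):
    """Stage 1: filter raw (x, y, z) samples with a moving average."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(tuple(mean(axis) for axis in zip(*chunk)))
    return out

def to_displacements(samples):
    """Stage 2: turn positions into per-step displacement vectors."""
    return [tuple(b - a for a, b in zip(p, q))
            for p, q in zip(samples, samples[1:])]

def to_symbols(displacements, threshold=0.05):
    """Stage 3: map displacements onto a small symbolic vocabulary."""
    symbols = []
    for dx, dy, dz in displacements:
        if dz > threshold:
            symbols.append("rising")
        elif dz < -threshold:
            symbols.append("lowering")
        elif abs(dx) > abs(dy):
            symbols.append("forward" if dx > 0 else "backward")
        else:
            symbols.append("left" if dy > 0 else "right")
    return symbols

# Made-up trail: walk forward, then sink downwards.
readings = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.2, 0.0, 1.0),
            (0.2, 0.0, 0.8), (0.2, 0.0, 0.6)]
print(to_symbols(to_displacements(smooth(readings))))
# → ['forward', 'forward', 'lowering', 'lowering']
```

Each stage only depends on the output of the one before, which is the "stepwise procedure with subparts" point: stages can be swapped per sensor (ingredient) without touching the rest of the recipe.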

  6. Previous work • Signal to Symbol in other domains (NLP, Emotions, Protein folding) • Stepwise, stack-based separation of concerns/levels • Context awareness, context models, middleware infrastructure, programming paradigm, with simple logical examples (Lab assistant) • Robust location systems, fusion of similar sensors, uncertain reasoning of topological information • Not much “hard-core” detailed inference.

  7. Ideas for what to focus on in future • Analysis of relevant verticals (1 at a time) to find the best descriptions of / exact words for phenomena/features that can be disambiguated – define parameter space • Try to achieve recognition of these lists by building up inferences using raw data from different sensor systems. • For theoreticians / formal reasoners / linguists: • Sensor metric: entropy of signal: examine properties of the signal. • What happens to the recognition graph when sensors change. • Percentage of time X is recognised with probability Y. • Sensor fusion: how to use the data appropriately • Compute P(evidence | witness) • Semantic net: necessary and sufficient signal properties or data to infer a phenomenon, for example, a meeting.
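The "entropy of signal" metric above can be sketched directly: discretise a real-valued signal into bins and take the Shannon entropy of the empirical symbol distribution. The bin count and both example signals are made up for the sketch.

```python
# A hedged sketch of one proposed sensor metric: Shannon entropy of a
# discretised signal. A flat signal carries no information; a varied one
# carries more. Bin width and the example signals are invented here.

import math
from collections import Counter

def discretise(signal, n_bins=4):
    """Map a real-valued signal onto n_bins equal-width bins."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / n_bins or 1.0   # avoid zero width for flat signals
    return [min(int((x - lo) / width), n_bins - 1) for x in signal]

def shannon_entropy(symbols):
    """Entropy (bits) of the empirical distribution of a symbol stream."""
    counts = Counter(symbols)
    total = len(symbols)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

flat = [1.0] * 8                                  # uninformative sensor
varied = [0.1, 0.9, 0.4, 0.7, 0.2, 0.8, 0.3, 0.6]  # richer signal

print(shannon_entropy(discretise(flat)))    # → 0.0 bits
print(shannon_entropy(discretise(varied)))  # larger: more to disambiguate
```

A metric like this could feed the "what happens to the recognition graph when sensors change" question: a sensor whose signal entropy is near zero cannot contribute much disambiguation.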

  8. Example recognition graph [Figure: recognition graph relating sensor configurations (1 bat, 2 bats) and P(X) to the classes left, right, forwards, backwards and leaning]
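One way to read the recognition graph is as Bayesian sensor fusion: each bat contributes noisy evidence about a class such as "leaning", and adding a second bat sharpens the posterior. The sketch below uses naive (independent-sensor) Bayes with invented likelihoods; it is an illustration of the P(X) idea, not the talk's actual method.

```python
# A hedged sketch of naive Bayesian fusion for the recognition graph:
# two bats each give noisy evidence about whether the wearer is leaning.
# Prior and likelihood numbers are made up for illustration.

def fuse(prior, likelihoods):
    """P(hypothesis | evidence) under a naive independence assumption.

    likelihoods: list of (P(evidence | leaning), P(evidence | not leaning))
    pairs, one per sensor.
    """
    p_h, p_not = prior, 1.0 - prior
    for l_h, l_not in likelihoods:
        p_h *= l_h
        p_not *= l_not
    return p_h / (p_h + p_not)

# One bat tilted forward: weak evidence. A second bat (worn front and
# back) whose separation shrinks: stronger evidence of leaning.
print(fuse(0.1, [(0.6, 0.4)]))              # one bat: modest update
print(fuse(0.1, [(0.6, 0.4), (0.9, 0.2)]))  # two bats: stronger belief
```

This matches the slide's "1 bat / 2 bats" columns: the same class becomes recognisable with higher probability once a second sensor is plugged in.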

  9. “An” ontology of ontologies (Sentient vocabulary)… [Word cloud of candidate vocabulary: Cars, Sport, Environmental conditions, Social insects, Pheromones, Smell, Mood, Lighting, Speech tracking, Ergonomics, Theatre, Lecture Theatre, Alexander Technique, Body, RSI, Sound, Meeting, Location, Posture, Movement, Gesticulation, Activities, Leaning, Slouching, Topography, Corridor scale, Direction, Office scale (running around, energy path), Length, Speed]

  10. …My current corner • Using the sensors we’ve got, or simple extensions (multiple bats give much more information) • LT • Microphones, video, Ubisense kit • Speech tracking based on lecture notes, dialogue pattern, compression of: He goes THERE, THEN, in THIS way, saying THIS in THIS way. • Interest level, sight lines, obscuration • 1 + 2 Bats • Worn normally • Worn front and back • On lapels and in pockets • As a ring and wrist band • 3D visualisation of trails. Now machine clustering, classification? • Upwards: add temporal index and extend uncertainty beyond the sensor system: (Z+, X-) → (forwards, left) → shapes. • Up 5 cm means different things, depending on the history. • Detecting a slouch. Good example: defined application area, extends vocabulary, granularity of sensors supports it, enough variation. • Standing, lowering, contact point, relax. • Any user • Beeps if polling rate too low → metric
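The slouch-detection bullets (standing, lowering, contact point, relax) and the point that "up 5 cm means different things depending on the history" suggest a small state machine over bat height. The sketch below illustrates that history dependence; the state names, heights and thresholds are all invented, not taken from the Bat system.

```python
# An illustrative state machine for slouch detection: the same 5 cm drop
# in bat height is interpreted differently depending on recent history
# (still lowering into a chair vs already seated). Numbers are made up.

STANDING_Z = 1.60   # assumed bat height (metres) when the wearer stands
SEATED_Z = 1.20     # assumed height at the contact point (seated)

def classify_posture(z_trace, slouch_drop=0.05):
    """Label each height sample: standing / lowering / seated / slouching."""
    labels, state, seated_ref = [], "standing", None
    for z in z_trace:
        if state == "standing" and z < STANDING_Z - 0.10:
            state = "lowering"                        # started to sit
        elif state == "lowering" and z <= SEATED_Z + 0.05:
            state, seated_ref = "seated", z           # contact point reached
        elif state == "seated" and z < seated_ref - slouch_drop:
            state = "slouching"                       # 5 cm drop AFTER sitting
        labels.append(state)
    return labels

trace = [1.60, 1.58, 1.45, 1.30, 1.21, 1.20, 1.19, 1.13]
print(classify_posture(trace))
# → ['standing', 'standing', 'lowering', 'lowering',
#    'seated', 'seated', 'seated', 'slouching']
```

Note the final 6 cm drop is only labelled "slouching" because the machine is already in the "seated" state; the identical drop during the "lowering" phase is part of normal sitting, which is exactly the history dependence the slide describes.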

  11. Previous work on standing/sitting [Diagram: Eli Katsiri, Thesis]

  12. Titles • Previous title • How to solve the meeting problem • Current title • Sentient recipes • Possible future title • {a specific recipe/type of dish(!)}

  13. People • Andy Hopper • Sean Holden • Rob Harle • Alastair Beresford • Alastair Tse • Bo

  14. Time to chat! • Comments? • Suggestions? • Criticisms? Thank you.
