
PULSAR: Perception Understanding Learning Systems for Activity Recognition




Presentation Transcript


  1. PULSAR: Perception Understanding Learning Systems for Activity Recognition. Theme: Cognitive Systems (Cog C), Multimedia data: interpretation and man-machine interaction. Multidisciplinary team: computer vision, artificial intelligence, software engineering.

  2. Team presentation. 5 research scientists: François Bremond (CR1 Inria, HDR), Guillaume Charpiat (CR2 Inria, 15 December 07), Sabine Moisan (CR1 Inria, HDR), Annie Ressouche (CR1 Inria), Monique Thonnat (DR1 Inria, HDR, team leader). 1 external collaborator: Jean-Paul Rigault (Prof. UNSA). 1 post-doc: Sundaram Suresh (PhD Bangalore, ERCIM). 5 temporary engineers: B. Boulay (PhD), E. Corvee (PhD), R. Ma (PhD), L. Patino (PhD), V. Valentin. 8 PhD students: B. Binh, N. Kayati, L. Le Thi, M.B. Kaaniche, V. Martin, A.T. Nghiem, N. Zouba, M. Zuniga. 1 external visitor: Tomi Raty (VTT Finland).

  3. PULSAR Objective: Cognitive Systems for Activity Recognition. Activity recognition: real-time semantic interpretation of dynamic scenes. Dynamic scenes: • Several interacting human beings, animals or vehicles • Long-term activities (hours or days) • Large-scale activities in the physical world (located in a large space) • Observed by a network of video cameras and sensors. Real-time semantic interpretation: • Real-time analysis of sensor output • Semantic interpretation with a priori knowledge of interesting behaviors.

  4. PULSAR scientific objectives. Objective: Cognitive Systems for Activity Recognition. Cognitive systems: perception, understanding and learning systems • Physical object recognition • Activity understanding and learning • System design and evaluation. Two complementary research directions: • Scene Understanding for Activity Recognition • Activity Recognition Systems.

  5. PULSAR target applications. Two application domains: • Safety/security (e.g. airport monitoring) • Healthcare (e.g. assistance to the elderly).

  6. Cognitive Systems for Activity Recognition: Airport Apron Monitoring. Outdoor scenes with complex interactions between humans, ground vehicles, and aircraft. Aircraft preparation: optional tasks, independent tasks, temporal constraints.

  7. Cognitive Systems for Activity Recognition: Monitoring Daily Living Activities of the Elderly. Goal: increase independence and quality of life: • Enable people to live at home • Delay entrance into a nursing home • Relieve family members and caregivers. Approach: • Detect changes in behavior (missing activities, disorder, interruptions, repetitions, inactivity) • Compute the degree of frailty of elderly people. Example of a normal activity sequence: meal preparation (in the kitchen, 11h–12h), eating (in the dining room, 12h–12h30), resting and TV watching (in the living room, 13h–16h), …
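To make the behavior-change detection concrete, here is a minimal Python sketch (not the team's method) that compares one day's observed activity sequence against a reference routine and flags missing, repeated, or out-of-order activities; the activity names and rules are illustrative assumptions.

# Hypothetical sketch: flag deviations between an observed day and a reference
# routine. Activity names and the rules are illustrative assumptions.
from collections import Counter

reference_routine = ["meal_preparation", "eat", "rest"]   # expected daily order
observed_day = ["meal_preparation", "rest", "rest"]        # as recognized by sensors/video

def routine_deviations(reference, observed):
    deviations = []
    ref_counts, obs_counts = Counter(reference), Counter(observed)
    for activity, expected in ref_counts.items():
        if obs_counts[activity] < expected:
            deviations.append(("missing", activity))
        elif obs_counts[activity] > expected:
            deviations.append(("repeated", activity))
    # crude order check on first occurrences of activities seen in both lists
    def first_occurrences(seq, shared):
        seen = []
        for a in seq:
            if a in shared and a not in seen:
                seen.append(a)
        return seen
    shared = set(reference) & set(observed)
    if first_occurrences(observed, shared) != first_occurrences(reference, shared):
        deviations.append(("disorder", first_occurrences(observed, shared)))
    return deviations

print(routine_deviations(reference_routine, observed_day))
# -> [('missing', 'eat'), ('repeated', 'rest')]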

  8. Gerhome laboratory (CSTB, PULSAR), http://gerhome.cstb.fr: presence sensor, contact sensors to detect "open/close", water sensor.

  9. From ORION to PULSAR. Orion contributions: • 4D semantic approach to video understanding • Program supervision approach to software reuse • VSIP platform for real-time video understanding → Keeneo start-up • LAMA platform for knowledge-based system design.

  10. From ORION to PULSAR. 1) New research axis: software architecture for activity recognition. 2) New application domain: healthcare (e.g. assistance to the elderly). 3) New research axis: machine learning for cognitive systems (mixing perception, understanding and learning). 4) New data types: video enriched with other sensors (e.g. contact sensors, …).

  11. PULSAR research directions: Perception for Activity Recognition (F. Bremond, G. Charpiat, M. Thonnat). • Goal: to extract rich physical object descriptions. • Difficulty: to obtain real-time performance and robust detection in dynamic and complex situations. • Approach: • Perception methods for shape, gesture and trajectory description of multiple objects • Multimodal data fusion from large sensor networks sharing the same 3D referential • Formalization of the conditions of use of the perception methods.
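As a minimal illustration of multimodal fusion in a shared 3D referential, the Python sketch below groups detections from several sensors that fall close together in space and time. It assumes detections are already expressed in common world coordinates; the greedy grouping rule and thresholds are illustrative, not the PULSAR algorithms.

# Hypothetical sketch: associate detections from several sensors that are close
# in a shared 3D world frame and a short time window. Thresholds are made up.
import math

detections = [  # (sensor_id, timestamp_s, x_m, y_m, z_m)
    ("camera_1", 10.02, 2.1, 3.0, 0.0),
    ("camera_2", 10.05, 2.2, 3.1, 0.0),
    ("presence_sensor", 10.10, 2.0, 3.2, 0.0),
]

def fuse(detections, max_dist=0.5, max_dt=0.2):
    """Greedy association of detections into fused object hypotheses."""
    fused = []  # each entry: list of associated detections
    for det in sorted(detections, key=lambda d: d[1]):
        _, t, x, y, z = det
        for group in fused:
            _, t0, x0, y0, z0 = group[-1]
            if math.dist((x, y, z), (x0, y0, z0)) <= max_dist and abs(t - t0) <= max_dt:
                group.append(det)
                break
        else:
            fused.append([det])
    return fused

for i, group in enumerate(fuse(detections)):
    print(f"object hypothesis {i}: supported by {[d[0] for d in group]}")

In a real deployment the shared referential would come from calibrating each camera and sensor against the same 3D scene model; here that step is simply assumed.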

  12. PULSAR research directions: Understanding for Activity Recognition (M. Thonnat, F. Bremond, S. Moisan). • Goal: physical object activity recognition based on a priori models. • Difficulty: vague end-user specifications and numerous observation conditions. • Approach: • A perceptual event ontology interfacing the perception and human operator levels • User-friendly activity model formalisms based on this ontology • Real-time activity recognition algorithms handling perceptual feature uncertainty and activity model complexity.

  13. PULSAR research directions: Learning for Activity Recognition (F. Bremond, G. Charpiat, M. Thonnat). • Goal: learning to decrease the effort needed for building activity models. • Difficulty: to get meaningful positive and negative samples. • Approach: • Automatic perception method selection by performance evaluation against ground truth • Dynamic parameter setting based on context clustering and parameter value optimization • Learning perceptual event concept detectors • Learning the mapping between basic event concepts and activity models • Learning complex activity models from frequent event patterns.
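A hedged Python sketch of the "context clustering + parameter optimization" idea (not the team's implementation): scene context descriptors are clustered offline, the best detector parameters found for each cluster are stored, and at run time the parameters of the nearest cluster are applied. The features, cluster count, and parameter values below are assumptions.

# Hypothetical sketch of dynamic parameter setting via context clustering.
# The context features, cluster count and stored parameter values are
# illustrative assumptions, not PULSAR's actual configuration.
import numpy as np
from sklearn.cluster import KMeans

# Offline: context descriptors from training sequences,
# e.g. [mean image brightness, fraction of moving pixels].
contexts = np.array([
    [0.90, 0.05], [0.85, 0.07],   # bright, quiet scenes
    [0.20, 0.40], [0.25, 0.35],   # dark, busy scenes
])
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(contexts)

# Offline: best detector parameters found per context cluster, e.g. by a grid
# search against ground truth (values made up; cluster labels are arbitrary).
best_params = {0: {"detection_threshold": 0.30},
               1: {"detection_threshold": 0.55}}

def parameters_for(context_descriptor):
    """Online: pick the parameter set stored for the nearest context cluster."""
    cluster = int(kmeans.predict(np.array([context_descriptor]))[0])
    return best_params[cluster]

print(parameters_for([0.88, 0.06]))  # parameters of the nearest context cluster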

  14. PULSAR research directions: Activity Recognition Systems (S. Moisan, A. Ressouche, J.-P. Rigault). • Goal: provide new techniques for easy design of effective and efficient activity recognition systems. • Difficulty: reusability vs. efficiency; from the VSIP library and LAMA platform to an AR platform. • Approach: • Activity models: models, languages and tools for all AR tasks • Platform architecture: design a platform with real-time response, parallel and distributed capabilities • System safeness: adapt state-of-the-art verification & validation techniques to AR system design.

  15. Objectives for the next period. PULSAR: Scene Understanding for Activity Recognition. • Perception: multi-sensor fusion, interest points and mobile regions, shape statistics. • Understanding: uncertainty, 4D coherence, ontology for activity recognition. • Learning: parameter setting, event detectors, video mining. PULSAR: Activity Recognition Systems. From the LAMA platform to the AR platform: • Model extensions: modeling time and scenarios • Architecture: real-time response, parallelization, distribution • User-friendliness and safeness of use: theory and tools for a component framework, scalability of verification methods.

  16. Multimodal Fusion for Monitoring Daily Living Activities of the Elderly. Meal preparation activity: multimodal recognition, person recognition. Resting in living room activity: person recognition, 3D posture recognition.

  17. Multimodal Fusion for Monitoring Daily Living Activities of the Elderly. Resting in living room activity: person recognition, 3D posture recognition.

  18. Multimodal Fusion for Monitoring Daily Living Activities of the Elderly. Meal preparation activity: multimodal recognition, person recognition.

  19. Understanding and Learning for Airport Apron Monitoring. European project AVITRACK (2004-2006): predefined activities. European project COFRIEND (2008-2010): activity learning, dynamic configurations.

  20. Activity Recognition Platform Architecture. Application level: airport monitoring, vandalism detection, elderly monitoring; configuration and deployment tools. Task level: program supervision, object recognition and tracking, scenario recognition; communication and interaction facilities. Component level: perception components, understanding components, learning components; usage support tools: ontology management, parser generation, component assembly, simulation & testing, verification.

  21. PULSAR Project-team. Any questions?

  22. Video Data Mining. Objective: knowledge extraction for video activity monitoring with unsupervised learning techniques. Methods: trajectory characterization through clustering with Self-Organizing Maps (SOM), and behaviour analysis of objects with relational analysis [1], i.e. analysis of the similarity between two individuals i, i' given a variable. [Figure: SOM prototype map with neurons m1 … mK.]

  [1] Benhadda H., Marcotorchino F., "Introduction à la similarité régularisée en analyse relationnelle", Revue de Statistique, Vol. 46, No. 1, pp. 45-69, 1998.
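The trajectory clustering step can be illustrated with a tiny Self-Organizing Map over fixed-length trajectory descriptors. This is a toy Python sketch, not the implementation behind the results below; the map size, learning schedule, and resampled-trajectory representation are assumptions.

# Hypothetical sketch: cluster trajectories with a tiny 1-D Self-Organizing Map.
# Each trajectory is assumed resampled to a fixed-length vector of (x, y) points.
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=9, epochs=50, lr0=0.5, radius0=3.0):
    """data: (n_samples, n_features). Returns prototype vectors m_1..m_K."""
    prototypes = data[rng.choice(len(data), n_units)].astype(float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(prototypes - x, axis=1))  # best matching unit
            dist_on_map = np.abs(np.arange(n_units) - bmu)
            neighbourhood = np.exp(-(dist_on_map ** 2) / (2 * radius ** 2))
            prototypes += lr * neighbourhood[:, None] * (x - prototypes)
    return prototypes

def assign(data, prototypes):
    """Label each trajectory with its nearest prototype (cluster index)."""
    return np.argmin(np.linalg.norm(data[:, None, :] - prototypes[None], axis=2), axis=1)

# Toy trajectories: 5 (x, y) points each, flattened to 10-dimensional vectors.
trajectories = rng.normal(size=(200, 10))
som = train_som(trajectories)
clusters = assign(trajectories, som)
print("cluster sizes:", np.bincount(clusters, minlength=len(som)))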

  23. Video Data Mining Results (2052 trajectories). Step 1: trajectory clustering (SOM). Trajectory Cluster 9: walk from north gates to south exit. Trajectory Cluster 1: walk from north doors to vending machines. Step 2: behaviour relational analysis. Behavior Cluster 19: individuals, and not groups, buy a ticket at the entrance.

  24. Multimodal Fusion for Monitoring Daily Living Activities of the Elderly. Scenario for meal preparation (Open_Microwave / Close_Microwave are detected by a contact sensor; the person's presence in the kitchen zone is detected by a video camera):

  Composite Event(Use_microwave,
    Physical Objects((p: Person), (Microwave: Equipment), (Kitchen: Zone))
    Components((p_inz: PrimitiveState inside_zone(p, Kitchen))
               (open_mw: PrimitiveEvent Open_Microwave(Microwave))
               (close_mw: PrimitiveEvent Close_Microwave(Microwave)))
    Constraints((open_mw during p_inz)
                (open_mw->StartTime + 10s < close_mw->StartTime)))
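A hedged sketch of how such a composite event could be checked against detected primitive states and events (simplified single-instance matching in Python; the data structures and recognition loop are illustrative assumptions, not the PULSAR recognition algorithm):

# Hypothetical sketch: check the Use_microwave constraints against detections.
# Times are in seconds; the dictionaries below stand in for detected intervals.

# Primitive state detected by video: person p inside the Kitchen zone.
p_inz = {"start": 100.0, "end": 900.0}          # inside_zone(p, Kitchen)

# Primitive events detected by the contact sensor on the microwave door.
open_mw = {"start": 180.0}                       # Open_Microwave(Microwave)
close_mw = {"start": 240.0}                      # Close_Microwave(Microwave)

def use_microwave_recognized(p_inz, open_mw, close_mw):
    # Constraint 1: open_mw during p_inz
    open_during_presence = p_inz["start"] <= open_mw["start"] <= p_inz["end"]
    # Constraint 2: open_mw->StartTime + 10s < close_mw->StartTime
    closed_after_10s = open_mw["start"] + 10.0 < close_mw["start"]
    return open_during_presence and closed_after_10s

print(use_microwave_recognized(p_inz, open_mw, close_mw))  # True for these detections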
