
ISR – Institute of Systems and Robotics University of Coimbra - Portugal






Presentation Transcript


  1. Institute of Systems and Robotics http://paloma.isr.uc.pt ISR – Institute of Systems and Robotics University of Coimbra - Portugal

  2. Human-Robot Interaction Determining face orientation for a robot able to interpret facial expressions Carlos Simplício, José Prado and Jorge Dias Presented by José Prado, 2010-03-10

  3. Human-Robot Interaction Summary • Introduction (Interactive Mobile Robots) • Autonomous Mobile Agent (AMA) • Robotic System Controller (RSC) • Face Pose Identification System (FPIS) • Automatic Facial Expressions Recognition System (AFERS) • (Structure of a DBN classifying facial expressions)

  4. Human-Robot Interaction Summary • Introduction (Interactive Mobile Robots) • Autonomous Mobile Agent (AMA) • Robotic System Controller (RSC) • Face Pose Identification System (FPIS) • Automatic Facial Expressions Recognition System (AFERS) • (Structure of a DBN classifying facial expressions)

  5. Introduction We are developing a service/assistant robot, an Autonomous Mobile Agent (AMA). This agent will be used in the context of ambient assisted living. The global project addresses the emerging trend of developing new devices for assistance and services.

  6. Introduction • Human beings express their emotional states through: • facial expressions • gestures • voice • etc. • We propose: • a technique to determine face orientation based on human face symmetry; • a DBN to classify human facial expressions.

  7. Introduction The AMA must observe and react according to the facial expressions of a person. Facial expression recognition becomes easier when performed on frontal face images. The robotic system follows the human being's movements so as to always keep a frontal view of the face.

  8. Summary • Introduction (Interactive Mobile Robots) • Autonomous Mobile Agent (AMA) • Robotic System Controller (RSC) • Face Pose Identification System (FPIS) • Automatic Facial Expressions Recognition System (AFERS) • (Structure of a DBN classifying facial expressions)

  9. AMA - architecture

  10. Summary • Introduction (Interactive Mobile Robots) • Autonomous Mobile Agent (AMA) • Robotic System Controller (RSC) • Face Pose Identification System (FPIS) • Automatic Facial Expressions Recognition System (AFERS) • (Structure of a DBN classifying facial expressions)

  11. Robotic System Controller - RSC • Robotic Platform movements: • Longitudinal translations; • Transversal translations; • Rotations. • Rotations correspond to an arc of a circle centered on the human being. • The objective is to follow the rotation performed by the human being, always keeping an image of a frontal face. • The Robotic Head can move in synchronization.
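The arc-of-circle motion can be sketched numerically. This is a minimal geometric illustration, not the authors' controller: the function name, the assumption that the person's distance and face yaw are already estimated, and the simple kinematics are all hypothetical.

```python
import math

def arc_motion(face_yaw_deg, radius_m):
    """Move along a circular arc centered on the person so the camera
    returns to a frontal view. The platform sweeps the same angle as the
    face's out-of-plane rotation; by s = r * theta, the arc length grows
    linearly with the person's distance.

    Returns (arc_length_m, heading_change_deg).
    """
    theta = math.radians(face_yaw_deg)
    arc_length = abs(theta) * radius_m   # distance traveled along the arc
    heading_change = face_yaw_deg        # platform rotation while circling
    return arc_length, heading_change
```

For example, compensating a 30º face rotation at a 2 m distance requires an arc of about 1.05 m while the platform (and, in synchronization, the robotic head) rotates 30º.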

  12. Summary • Introduction (Interactive Mobile Robots) • Autonomous Mobile Agent (AMA) • Robotic System Controller (RSC) • Face Pose Identification System (FPIS) • Automatic Facial Expressions Recognition System (AFERS) • (Structure of a DBN classifying facial expressions)

  13. Face Pose Identification System - FPIS In a perfectly symmetric image, pixels in symmetric positions have the same gray-level value: the difference is zero. We use this principle to verify whether an image is symmetric, i.e., a frontal face. Example 1 Example 2

  14. Face Pose Identification System - FPIS • In a perfectly symmetric image, pixels in symmetric positions have the same gray-level value: the difference is zero. • Problems: • By nature, human faces are not perfectly symmetric; • There are shadows. • But it works!

  15. Face Pose Identification System - FPIS Define a vertical axis (always in the same position); calculate the differences in gray level between symmetrically positioned pixels; build the Normalized Gray-level Difference Histogram (NGDH). For a frontal face, the vertical axis bisects the face and the information collected in the NGDH is strongly concentrated near the mean. Otherwise, the information is scattered along the NGDH. NGDH with scattered information NGDH with concentrated information
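The NGDH construction above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the bin count, and the band half-width are all choices made for the sketch.

```python
import numpy as np

def ngdh(face, bins=64):
    """Normalized Gray-level Difference Histogram of a grayscale face region.

    Mirrors the right half onto the left half about the central vertical
    axis and histograms the gray-level differences of symmetric pixels.
    """
    face = np.asarray(face, dtype=np.int16)   # signed, so differences can be negative
    half = face.shape[1] // 2
    left = face[:, :half]
    right = np.fliplr(face[:, -half:])        # right half, mirrored
    diff = (left - right).ravel()             # differences of symmetric pixel pairs
    hist, edges = np.histogram(diff, bins=bins, range=(-255, 255))
    return hist / hist.sum(), edges           # normalize to a distribution

def pseudomean(hist, halfwidth=2):
    """Mass of the NGDH inside a narrow band around the zero-difference bin."""
    center = len(hist) // 2
    return hist[center - halfwidth:center + halfwidth + 1].sum()
```

For a frontal face the differences cluster near zero, so the `pseudomean` is high; for a rotated face the histogram spreads out and the value drops.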

  16. Face Pose Identification System - FPIS Algorithm: Find and extract the face region in the image; Define a vertical axis (dividing the region into two parts with an equal number of pixels); Synthesize face images - use the vertical axis to perform a 3D transformation (rotation); the synthesized images are "hypotheses" for the face's out-of-plane rotation; Build the NGDHs; Find the pseudomean - the number of occurrences in a narrow region around the NGDH's mean; The synthesized image with the greatest pseudomean contains the frontal face!
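The hypothesis-and-test loop of this algorithm can be sketched as follows. The 3D rotation synthesis is passed in as a callable because the rendering itself is outside the scope of the sketch; the function names, the compact symmetry score standing in for the NGDH pseudomean, and the candidate angle set are all assumptions of this illustration.

```python
import numpy as np

def symmetry_score(face, band=8):
    """Fraction of symmetric-pixel gray-level differences within +/-band of
    zero -- a compact stand-in for the pseudomean of the NGDH."""
    face = np.asarray(face, dtype=np.int16)
    half = face.shape[1] // 2
    diff = face[:, :half] - np.fliplr(face[:, -half:])
    return np.mean(np.abs(diff) <= band)

def estimate_orientation(face, synthesize, angles=(-30, 0, 30)):
    """Hypothesis-and-test: synthesize a rotated view for each candidate
    angle and keep the angle whose image looks most symmetric (frontal)."""
    scores = {a: symmetry_score(synthesize(face, a)) for a in angles}
    return max(scores, key=scores.get)
```

The winning rotation is the one that brings the face back to frontal, matching the result slides below: a face originally at -30º scores highest for the +30º hypothesis, and vice versa.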

  17. Face Pose Identification System - FPIS

  18. Face Pose Identification System - FPIS Algorithm: Find and extract the face region in the image; Define a vertical axis (dividing the region into two parts with an equal number of pixels); Synthesize face images - use the vertical axis to perform a 3D transformation (rotation); the synthesized images are "hypotheses" for the face's out-of-plane rotation; Build the NGDHs; Find the pseudomean - the number of occurrences in a narrow region around the NGDH's mean; The synthesized image with the greatest pseudomean contains the frontal face!

  19. Face Pose Identification System - FPIS

  20. Face Pose Identification System - FPIS Algorithm: Find and extract the face region in the image; Define a vertical axis (dividing the region into two parts with an equal number of pixels); Synthesize face images - use the vertical axis to perform a 3D transformation (rotation); the synthesized images are "hypotheses" for the face's out-of-plane rotation; Build the NGDHs; Find the pseudomean - the number of occurrences in a narrow region around the NGDH's mean; The synthesized image with the greatest pseudomean contains the frontal face!

  21. Face Pose Identification System - FPIS

  22. Face Pose Identification System - FPIS

  23. Face Pose Identification System - FPIS Algorithm: Find and extract the face region in the image; Define a vertical axis (dividing the region into two parts with an equal number of pixels); Synthesize face images - use the vertical axis to perform a 3D transformation (rotation); the synthesized images are "hypotheses" for the face's out-of-plane rotation; Build the NGDHs; Find the pseudomean - the number of occurrences in a narrow region around the NGDH's mean; The synthesized image with the greatest pseudomean contains the frontal face!

  24. Face Pose Identification System - FPIS

  25. Face Pose Identification System - FPIS Original Angle = 0º: Rotation -30º → Result -30º; Rotation 0º → Result 0º; Rotation +30º → Result +30º

  26. Face Pose Identification System - FPIS Original Angle = -30º: Rotation -30º → Result -60º; Rotation 0º → Result -30º; Rotation +30º → Result 0º

  27. Face Pose Identification System - FPIS Original Angle = +30º: Rotation -30º → Result 0º; Rotation 0º → Result +30º; Rotation +30º → Result +60º

  28. Summary • Introduction (Interactive Mobile Robots) • Autonomous Mobile Agent (AMA) • Robotic System Controller (RSC) • Face Pose Identification System (FPIS) • Automatic Facial Expressions Recognition System (AFERS) • (Structure of a DBN classifying facial expressions)

  29. Facial Expressions We consider only five emotional states. Each emotional state has a characteristic facial expression. A facial expression is a set of Action Units (AUs). AUs are "distortions" of facial features, e.g., the lips in a smile. (Example images: anger, fear, happy, neutral, sad.)
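The idea that a facial expression is a set of AUs can be sketched as a lookup. The AU combinations below are commonly cited FACS associations, used here only for illustration; they are not necessarily the 11 AUs used in the authors' DBN, and the matching function is a hypothetical stand-in for the probabilistic classifier described next.

```python
# Commonly cited FACS action-unit combinations for the basic expressions
# (illustrative; not necessarily the AU set used by the authors).
EXPRESSION_AUS = {
    "happy":   {6, 12},            # cheek raiser + lip corner puller (smile)
    "sad":     {1, 4, 15},         # inner brow raiser + brow lowerer + lip corner depressor
    "anger":   {4, 5, 7, 23},      # brow lowerer + upper lid raiser + lid tightener + lip tightener
    "fear":    {1, 2, 4, 5, 20, 26},
    "neutral": set(),              # no activated AUs
}

def match_expression(observed_aus):
    """Pick the expression whose AU set best overlaps the observed AUs
    (Jaccard similarity); a crude, deterministic stand-in for the DBN."""
    def score(expected):
        if not expected:
            return 1.0 if not observed_aus else 0.0
        return len(observed_aus & expected) / len(observed_aus | expected)
    return max(EXPRESSION_AUS, key=lambda e: score(EXPRESSION_AUS[e]))
```

For instance, observing AU6 and AU12 together matches "happy", while an empty AU set matches "neutral".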

  30. DBN's Structure

  31. DBN's Structure Level 1 A node (variable) that probabilistically reflects the existence of an Emotional State. • The Emotional States considered are: • anger • fear • happy • sad • neutral • other

  32. DBN's Structure Level 2 Nodes (variables) that probabilistically reflect the existence of a facial expression. • The expressions considered are: • anger • fear • happy • sad • neutral ...

  33. DBN's Structure Level 3 11 AUs are considered in each facial expression.

  34. DBN's Structure Level 4 Nodes (variables) that probabilistically reflect the strength of the evidence (positive or negative). ...

  35. DBN's Structure Level 5 Here, information is propagated between time slices. These nodes (variables) probabilistically combine/fuse, through inertia, information coming from the lower level in the present time slice with information from the previous instant. ... ...
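The inter-slice propagation described above can be sketched as an inertia-weighted update. This is a simplified stand-in for the DBN's temporal fusion, not the authors' model: the function name, the fixed inertia weight, and the discrete five-state belief vector are assumptions of the sketch.

```python
import numpy as np

def temporal_fusion(prev_belief, likelihood, inertia=0.7):
    """Fuse the previous time slice's belief with the current slice's
    evidence likelihood, weighted by an inertia term, and renormalize
    so the result is again a probability distribution over states."""
    prev_belief = np.asarray(prev_belief, dtype=float)
    likelihood = np.asarray(likelihood, dtype=float)
    fused = inertia * prev_belief + (1 - inertia) * likelihood
    return fused / fused.sum()
```

The effect is that a transient misdetection in a single frame barely moves the belief, while evidence sustained over several time slices gradually shifts it.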

  36. DBN's Structure Level 6 Nodes (variables) collecting the evidence provided by the sensors. ...

  37. Conclusions • We developed: • An architecture for an Autonomous Mobile Agent; • A Face Orientation Identification Technique; • A structure for a DBN. • The Face Pose Identification Technique performs well and is very fast. • Classifying facial expressions using positive and negative evidence is very promising.

  38. END Thanks for your attention!!! Questions ?
