
LYU0603: A Generic Real-Time Facial Expression Modelling System

This project aims to detect and model facial expressions in real-time using a webcam. The system enhances web meetings and eliminates the need for specific hardware, making it more accessible to users.

Presentation Transcript


  1. LYU0603 A Generic Real-Time Facial Expression Modelling System Supervisor: Prof. Michael R. Lyu Group Members: Cheung Ka Shun (05521661), Wong Chi Kin (05524554)

  2. Outline • Project Overview • Motivation • Objective • System Architecture • Face Coordinate Filter • Implementation • Facial expression analysis • Face modelling • Demonstration • Future work • Conclusion

  3. Project Overview • Detect facial expressions • Draw the corresponding model

  4. Motivation • Face recognition technology has become more common • Webcams now offer sufficiently high resolution • Computation power is now high enough

  5. Objective • Enrich the functionality of the webcam • Make net-meeting more interesting • Users are not required to pay extra for specific hardware • Recognize human faces generically

  6. System Architecture

  7. Face Coordinate Filter

  8. Face Coordinate Filter • Our system is built on top of this filter • Input: video source • Output: the coordinates of the face mesh vertices (a sketch of this output follows)
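
The slides do not show the filter's actual interface. The following is a minimal sketch, with illustrative names only, of the kind of per-frame output the rest of this write-up assumes: a list of face-mesh vertex coordinates.

// Hypothetical sketch of the face coordinate filter's per-frame output.
// The original filter's real interface is not shown in the slides; the
// names here are illustrative only.
#include <vector>

struct MeshVertex {
    float x;   // horizontal position, in face-mesh units
    float y;   // vertical position, in face-mesh units
};

// One frame of output: the coordinates of every vertex of the tracked face mesh.
typedef std::vector<MeshVertex> FaceMeshFrame;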

  9. Implementation – Face outline

  10. Implementation – Face outline

  11. Implementation - Calibration • Face mesh coordinates => pixel coordinates (the conversion formula appears on the slide and is not reproduced here; see the sketch below)
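
Since the slide's conversion formula is not in the transcript, the following is an illustration only: a simple scale-and-offset mapping from face-mesh units to pixel coordinates. The function name and parameters are hypothetical.

// Hypothetical calibration step: map a face-mesh coordinate to a pixel
// coordinate. Assumes a plain scale-and-offset mapping derived from the
// video frame size; the project's actual formula may differ.
struct PixelPoint { int x; int y; };

PixelPoint MeshToPixel(float meshX, float meshY,
                       float scaleX, float scaleY,     // mesh units -> pixels
                       float offsetX, float offsetY)   // mesh origin in pixels
{
    PixelPoint p;
    p.x = static_cast<int>(meshX * scaleX + offsetX);
    p.y = static_cast<int>(meshY * scaleY + offsetY);
    return p;
}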

  12. Implementation – Facial Expression Analysis • We assume that the face coordinate filter works properly • Detect eye blinking and mouth opening from the coordinates • With sample movies, record the coordinate changes • Plot graphs • Statistical analysis

  13. Implementation – Facial Expression Analysis • Using vertex pairs (33, 41), (34, 40), (35, 39) (figure: part of the face mesh, eye region; see the sketch below)
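
Based on the vertex pairs named above and the 1-unit threshold mentioned on slide 17, a minimal eye-blink check could look like the sketch below. It reuses the MeshVertex and FaceMeshFrame types from the earlier sketch and is not necessarily the project's exact rule.

// Sketch of eye-blink detection from the vertex pairs named on the slide.
// The 1-unit threshold is taken from slide 17; the project's exact rule
// may differ.
#include <cmath>

bool IsEyeClosed(const FaceMeshFrame& mesh)
{
    const int pairs[3][2] = { {33, 41}, {34, 40}, {35, 39} };  // eyelid vertex pairs

    float total = 0.0f;
    for (int i = 0; i < 3; ++i) {
        const MeshVertex& a = mesh[pairs[i][0]];
        const MeshVertex& b = mesh[pairs[i][1]];
        float dx = a.x - b.x;
        float dy = a.y - b.y;
        total += std::sqrt(dx * dx + dy * dy);
    }
    return (total / 3.0f) < 1.0f;   // average gap below 1 mesh unit => closed
}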

  14. Implementation – Facial Expression Analysis

  15. Implementation – Facial Expression Analysis • Using vertex pairs (69, 77), (70, 76), (71, 75) for the outer bound of the lips • Using vertex pairs (93, 99), (94, 98), (95, 97) for the inner bound of the lips (figure: part of the face mesh, mouth region)

  16. Implementation – Facial Expression Analysis

  17. Implementation – Facial Expression Analysis • Distance between two paired vertices < 1 unit is used as the detection threshold • Other factors also affect the measured difference • The distance between the camera and the user • The user moving his or her head quickly • (figure shown for reference)

  18. Implementation – Facial Expression Analysis • Three methods • Competitive Area Ratio • Horizontal Eye-Balance • Nearest-Colour Convergence

  19. Competitive Area Ratio • To detect whether the mouth is open or closed

  20. Competitive Area Ratio • We can compute the area of each triangle from its vertex coordinates (the formula on the slide is not reproduced here; see the sketch below)
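
The area formula itself is not in the transcript. Assuming it is the standard coordinate (cross-product) formula, a sketch of the area computation and a hypothetical area-ratio test might look like the following; the specific vertices and the threshold are illustrative, not the project's.

// Area of a triangle from its three vertex coordinates: half the magnitude
// of the 2-D cross product. Whether the slide's missing formula is exactly
// this form is an assumption. Reuses MeshVertex from the earlier sketch.
#include <cmath>

float TriangleArea(const MeshVertex& a, const MeshVertex& b, const MeshVertex& c)
{
    return 0.5f * std::fabs((b.x - a.x) * (c.y - a.y) -
                            (c.x - a.x) * (b.y - a.y));
}

// Hypothetical "competitive area ratio": compare the area of a triangle
// spanned by inner-lip vertices against a reference triangle that changes
// little when the mouth opens. The actual vertex choices are not given in
// the transcript.
bool IsMouthOpen(const MeshVertex& innerTop, const MeshVertex& innerBottom,
                 const MeshVertex& innerCorner,
                 const MeshVertex& refA, const MeshVertex& refB,
                 const MeshVertex& refC, float ratioThreshold)
{
    float mouthArea = TriangleArea(innerTop, innerBottom, innerCorner);
    float refArea   = TriangleArea(refA, refB, refC);
    return refArea > 0.0f && (mouthArea / refArea) > ratioThreshold;
}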

  21. Horizontal Eye-Balance • To detect whether the head is inclined

  22. Horizontal Eye-Balance Approach I

  23. Horizontal Eye-Balance Approach I However…

  24. Horizontal Eye-Balance Approach II
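
The figures describing Approach I and Approach II are not in the transcript. Shown purely as a sketch, one plausible way to measure head inclination from two eye vertices is the angle of the line joining them against the horizontal.

// Sketch only: head inclination as the angle of the line joining the two
// eye vertices. This is not confirmed to be either of the slide's two
// approaches. Reuses MeshVertex from the earlier sketch.
#include <cmath>

float HeadInclineDegrees(const MeshVertex& leftEye, const MeshVertex& rightEye)
{
    float dy = rightEye.y - leftEye.y;
    float dx = rightEye.x - leftEye.x;
    return std::atan2(dy, dx) * 180.0f / 3.14159265f;  // 0 degrees = level head
}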

  25. Nearest-Colour Convergence • Retrieve the pixel colours in the specific area • Split each pixel colour into three channels (RGB) • Take the average value of each channel

  26. Nearest-Colour Convergence • Colour-space difference (formula on the slide) • The eye is classified as closed when the condition on the slide is met (see the sketch below for one possible form)
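
The colour-space difference and the closed-eye condition are given as formulas on the slide and are not in the transcript. The sketch below shows one possible form: per-channel averages over the sampled region and a Euclidean RGB distance to a reference colour, compared against an assumed threshold.

// Sketch of the nearest-colour test. The reference colour, the distance
// metric, and the threshold are assumptions; the slide's actual condition
// is not reproduced in the transcript.
#include <cmath>
#include <vector>

struct RGB { float r, g, b; };

// Average colour of the sampled eye region, channel by channel.
RGB AverageColour(const std::vector<RGB>& pixels)
{
    RGB avg = {0.0f, 0.0f, 0.0f};
    for (size_t i = 0; i < pixels.size(); ++i) {
        avg.r += pixels[i].r; avg.g += pixels[i].g; avg.b += pixels[i].b;
    }
    float n = static_cast<float>(pixels.size());
    avg.r /= n; avg.g /= n; avg.b /= n;
    return avg;
}

// Euclidean distance in RGB colour space between the current average and a
// reference colour (for example, skin sampled when the eye is known closed).
bool IsEyeClosedByColour(const RGB& current, const RGB& reference, float threshold)
{
    float dr = current.r - reference.r;
    float dg = current.g - reference.g;
    float db = current.b - reference.b;
    return std::sqrt(dr * dr + dg * dg + db * db) < threshold;
}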

  27. Direct3D • The characters to be used in the system modelling

  28. Texture Sampler (textures for the mouth-opened and eye-closed states)

  29. Texture Sampler • Pre-production of the images

  30. Texture Sampler • Loading the texture • Mapping object coordinates to texel coordinates

  31. Texture Sampler • Prepare the index buffer (a combined sketch of the texture loading and index buffer setup follows)
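
Slides 29 to 31 cover loading the texture, mapping object coordinates to texel coordinates, and preparing the index buffer. The following is a minimal Direct3D 9 sketch of those steps; it assumes an already initialised device, and the texture file name is illustrative only.

// Vertex format: position plus one set of texel coordinates, so each object
// coordinate is mapped into the face texture via (u, v).
#include <d3d9.h>
#include <d3dx9.h>
#include <cstring>

struct TexturedVertex {
    float x, y, z;    // object coordinates
    float u, v;       // texel coordinates into the face texture
};
#define TEXTURED_FVF (D3DFVF_XYZ | D3DFVF_TEX1)

// Minimal sketch of the texture-sampler setup; pDevice is assumed to be an
// already initialised IDirect3DDevice9*. "face_texture.bmp" is illustrative.
HRESULT SetupTextureAndIndices(IDirect3DDevice9* pDevice,
                               IDirect3DTexture9** ppTexture,
                               IDirect3DIndexBuffer9** ppIndexBuffer,
                               const WORD* indices, UINT indexCount)
{
    // 1. Load the pre-produced face texture (slides 29 and 30).
    HRESULT hr = D3DXCreateTextureFromFileA(pDevice, "face_texture.bmp", ppTexture);
    if (FAILED(hr)) return hr;

    // 2. Prepare the index buffer describing the face-mesh triangles (slide 31).
    hr = pDevice->CreateIndexBuffer(indexCount * sizeof(WORD),
                                    D3DUSAGE_WRITEONLY, D3DFMT_INDEX16,
                                    D3DPOOL_MANAGED, ppIndexBuffer, NULL);
    if (FAILED(hr)) return hr;

    void* pData = NULL;
    (*ppIndexBuffer)->Lock(0, indexCount * sizeof(WORD), &pData, 0);
    std::memcpy(pData, indices, indexCount * sizeof(WORD));
    (*ppIndexBuffer)->Unlock();
    return S_OK;
}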

  32. Facial Expression Modelling • Update the object coordinates each frame • Normalize the coordinate geometry (a sketch follows)
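
The normalisation used by the project is not spelled out in the transcript. As a sketch only, the detected mesh coordinates could be rescaled around their bounding box before being written into the vertex buffer; it reuses the MeshVertex type from the earlier sketch.

// Sketch of the per-frame update: normalise the detected mesh coordinates
// to roughly [-1, 1] around the mesh's bounding box before they are copied
// into the rendered geometry. The project's exact normalisation may differ.
#include <vector>

void NormalizeMesh(std::vector<MeshVertex>& mesh)
{
    if (mesh.empty()) return;
    float minX = mesh[0].x, maxX = mesh[0].x;
    float minY = mesh[0].y, maxY = mesh[0].y;
    for (size_t i = 1; i < mesh.size(); ++i) {
        if (mesh[i].x < minX) minX = mesh[i].x;
        if (mesh[i].x > maxX) maxX = mesh[i].x;
        if (mesh[i].y < minY) minY = mesh[i].y;
        if (mesh[i].y > maxY) maxY = mesh[i].y;
    }
    float halfW = (maxX - minX) * 0.5f, halfH = (maxY - minY) * 0.5f;
    float cx = minX + halfW, cy = minY + halfH;
    for (size_t i = 0; i < mesh.size(); ++i) {
        mesh[i].x = (mesh[i].x - cx) / (halfW > 0.0f ? halfW : 1.0f);
        mesh[i].y = (mesh[i].y - cy) / (halfH > 0.0f ? halfH : 1.0f);
    }
}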

  33. Our System

  34. Demonstration • We are going to play a movie clip which demonstrates our system

  35. Future Work • Improve the precision of face detection • Use a 3-D model instead of a 2-D texture • Allow net-meeting software to use the system

  36. Conclusion • We have learned to work with DirectShow and Direct3D • We have developed methods to detect facial expressions

  37. End • Thank you! • Q&A
