
Integrate face tracking into driving simulator COMP6470 Special topic in computing


Presentation Transcript


  1. Integrate face tracking into driving simulator (COMP6470 Special topic in computing). Lei Wang. Supervisor: Tom Gedeon

  2. Motivation & Objectives  Do most drivers show their emotions on their face when they have a car crash? If not, integrating face tracking into driving is pointless.  For most young (novice) drivers, is there a stronger correlation between the degree of facial expression and their real emotional state?  Do any human factors affect drivers' facial expressiveness?  Gain experience with integrating face tracking tools

  3. Experiment on Open Day

  4. Results analysis  Automatically find peak points in the GSR data and the corresponding frames in the recorded video, in Matlab.  Calculate EmotionOnFaceRate from human observation: EmotionOnFaceRate = (number of GSR peaks at which the driver shows any facial expression in the corresponding frames) / (total number of GSR peaks).  Compare the performance of different groups based on the personal information and questionnaires we collected
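The EmotionOnFaceRate definition above can be sketched as follows (the deck's analysis was done in Matlab; this is an illustrative Python version, and the function and argument names are mine, not the project's):

```python
def emotion_on_face_rate(peak_frames, expressive_frames):
    """EmotionOnFaceRate = (# GSR peaks where the driver shows any
    facial expression in the corresponding frames) / (# GSR peaks)."""
    if not peak_frames:
        return 0.0
    hits = sum(1 for f in peak_frames if f in expressive_frames)
    return hits / len(peak_frames)

# Hypothetical example: 10 GSR peaks, expression visible at 3 of them.
rate = emotion_on_face_rate(list(range(10)), {2, 5, 7})
print(rate)  # → 0.3
```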

  5. Find peak points in GSR data via Matlab
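The slide's Matlab peak detection (e.g. `findpeaks`) boils down to locating local maxima above a threshold. A minimal stand-in in Python, assuming a 1-D GSR signal as a list of samples:

```python
def find_gsr_peaks(signal, min_height=0.0):
    """Return indices of local maxima in a 1-D GSR signal that are
    at least min_height (a simple stand-in for Matlab's findpeaks)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] > signal[i + 1] \
                and signal[i] >= min_height:
            peaks.append(i)
    return peaks

gsr = [0.1, 0.5, 0.2, 0.3, 0.9, 0.4, 0.4, 0.8, 0.1]
print(find_gsr_peaks(gsr, min_height=0.5))  # → [1, 4, 7]
```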

  6. Select corresponding frames from the video data
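Matching a GSR peak to a video frame is a timestamp conversion between the two sampling rates. A sketch, assuming both recordings start at the same instant (the sample rates below are hypothetical, not from the deck):

```python
def peak_to_frame(peak_index, gsr_hz, video_fps):
    """Map a GSR sample index to the video frame recorded at the
    same wall-clock time (assumes both recordings start together)."""
    t = peak_index / gsr_hz          # seconds since recording start
    return round(t * video_fps)      # nearest video frame

# Hypothetical rates: GSR sampled at 32 Hz, video at 25 fps.
frames = [peak_to_frame(p, 32, 25) for p in [64, 160, 320]]
print(frames)  # → [50, 125, 250]
```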

  7. Results  Comparison: drivers in the same age group but with different experience and feelings.
  Age:  Young (novice) drivers: age < 24 [S]  Middle-aged (general) drivers: age 24-40 [M]  Senior (experienced) drivers: age > 40 [L]
  Experience:  Have driving experience [DE]  Have been involved in road accidents as the driver [RA]  Have used a driving simulator [DS]
  Feeling:  Felt like driving a real car [LDRC]  Felt excited [F]  Successfully avoided a car accident in the game [ACA]
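The age brackets above, and the per-group averaging used on the following slides, can be sketched like this (the driver data layout is hypothetical; only the bracket boundaries come from the slide):

```python
def age_group(age):
    """Assign a driver to the deck's age brackets:
    [S] young < 24, [M] middle-aged 24-40, [L] senior > 40."""
    if age < 24:
        return "S"
    if age <= 40:
        return "M"
    return "L"

def group_average_rate(drivers):
    """Average EmotionOnFaceRate per age group; `drivers` is a list
    of (age, rate) pairs (an assumed data layout)."""
    sums, counts = {}, {}
    for age, rate in drivers:
        g = age_group(age)
        sums[g] = sums.get(g, 0.0) + rate
        counts[g] = counts.get(g, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}

# Toy data, not the study's measurements.
print(group_average_rate([(20, 0.3), (22, 0.2), (30, 0.19), (50, 0.15)]))
```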

  8. All drivers: average EmotionOnFaceRate = 0.22987

  9. Young (novice) drivers: average EmotionOnFaceRate = 0.256191

  10. Middle-aged (general) drivers: average EmotionOnFaceRate = 0.188863

  11. Senior (experienced) drivers: average EmotionOnFaceRate = 0.155582

  12. Integrating face tracking tools  Face tracking tools considered:  VOSM (✓)  FACET (✗)  FaceReader (✗)  Face++ (✗)  VOSM (Vision Open Statistical Models, C++):  Fits a face shape to an image, based on ASM (Active Shape Models)  Trained on IMM data (images + annotation files)  Build the VOSM project properly  In the VOSM project, reuse the model-training function, but edit the fitting function to read video and perform face tracking in real time
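The edit described on this slide (fitting per video frame instead of a single image) amounts to a loop like the sketch below. This is not VOSM code: `fit_shape` is a stub standing in for VOSM's C++ ASM fitting step, and the warm-start-from-previous-frame idea is a common tracking pattern, assumed here rather than taken from the deck:

```python
def fit_shape(frame, init_shape):
    """Stub for the ASM fitting step: refine the landmark shape for
    one frame. Here it simply echoes the initialization."""
    return init_shape

def track_video(frames, mean_shape):
    """Per-frame fitting loop: initialize each frame's fit from the
    previous frame's result so tracking can run in real time."""
    shape = mean_shape
    results = []
    for frame in frames:
        shape = fit_shape(frame, shape)  # warm-start from last fit
        results.append(shape)
    return results

# Toy run: 3 "frames", shape represented as a list of (x, y) points.
out = track_video([None, None, None], [(0, 0), (1, 1)])
print(len(out))  # → 3
```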

  13. Face tracking demo  Video link

  14. Conclusion  For most young (novice) drivers, the degree of facial expression correlates better with their real emotional state, especially when they have driving experience, though this also varies from person to person.  Companions' words and emotions affect a driver's facial expressiveness: drivers tend to respond in kind (silence with silence, excitement with excitement).  When integrating face tracking tools, the hardest part is not only understanding the method in theory and editing the code to meet our goal, but also building the open-source project, which takes hands-on experience.  Based on our results, an emotion-based car accident warning system must take into account the driver's age and their unique, subtle facial expressions.

  15. References
  [1] Isomursu, Minna, et al. "Experimental evaluation of five methods for collecting emotions in field settings with mobile applications." International Journal of Human-Computer Studies 65.4 (2007): 404-418.
  [2] Lisetti, Christine Lætitia, and Fatma Nasoz. "Using noninvasive wearable computers to recognize human emotions from physiological signals." EURASIP Journal on Applied Signal Processing 2004 (2004): 1672-1687.
  [3] Sebe, Nicu, et al. "Authentic facial expression analysis." Image and Vision Computing 25.12 (2007): 1856-1863.
  [4] Bailenson, Jeremy N., et al. "Real-time classification of evoked emotions using facial feature tracking and physiological responses." International Journal of Human-Computer Studies 66.5 (2008): 303-317.
