Tutorial Topic: Facial Expression & Emotion Detection for Man-Machine Interaction
Two legends of their own... people we can't forget. Technology has made them eternal.
Why an Emerging Trend?
• The demand for humanoid robots as service robots for everyday life has increased in recent years.
• Detecting emotions enables the robot to react appropriately to the emotional state of its communication partner.
• Humanoid robots are, as many movies show, of great interest: as entertainment robots, as tireless workers, or for the care of elderly people.
• All these scenarios have one thing in common: the robot is an accepted member of society, and it must therefore behave as a human would.
Outline
• Facial Expression Estimation
  – Face Detection
  – Facial Feature Extraction
  – Anatomical Constraints: Anthropometry
  – FP Localization
  – FAP Calculation
  – Expression Profiles
• Gesture Analysis
Face Detection
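The deck does not name a particular detector, so the following is only a minimal sketch using OpenCV's stock Haar cascade; the cascade file and the detection parameters are illustrative assumptions, not the deck's method.

```python
# Minimal face-detection sketch (assumed detector: OpenCV Haar cascade).
import cv2

def detect_faces(image_path):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Returns a list of (x, y, w, h) face bounding boxes.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in detect_faces("subject.jpg"):
    print(f"face at ({x}, {y}), size {w}x{h}")
```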
Facial Feature Extraction
• Multiple-cue facial feature boundary extraction:
  – Eyebrows
  – Eyes
  – Nose
  – Mouth
• Each mask is either an edge-based mask or an intensity-based mask.
• Each mask is validated independently (see the sketch below).
Multiple-Cue Facial Feature Extraction: An Example
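As a minimal sketch of the two cue types, assuming OpenCV's Canny and Otsu operators and a pixel-count validation criterion (the deck does not specify the exact operators, thresholds, or validation test):

```python
# Sketch of the two mask cues: an edge-based mask and an
# intensity-based mask, computed over a facial-feature ROI.
import cv2

def edge_mask(roi_gray):
    # Edge-based cue: strong gradients outline eyebrows, eyes, mouth.
    return cv2.Canny(roi_gray, 50, 150)

def intensity_mask(roi_gray):
    # Intensity-based cue: eyebrows, pupils, and the lip line are
    # darker than the surrounding skin.
    _, mask = cv2.threshold(roi_gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return mask

def validate(mask, min_area=30):
    # Each mask is validated independently; a minimum foreground
    # pixel count stands in here as a placeholder criterion.
    return cv2.countNonZero(mask) >= min_area
```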
ANTHROPOMETRY – Final Mask Validation
Facial distances, measured separately for males and females by the US Army over a 30-year period.
The measured distances are normalized by dividing by Distance 7 (DA7), the distance between the inner corners of the left and right eyes, two points a human cannot move.
DA5n, DA10n: distances in the figures normalized by division with DA7:
DA5n = DA5/DA7, DA10n = DA10/DA7
DAewn: normalized eye width (calculated from DA5 and DA7):
DAewn = ((DA5 - DA7)/2)/DA7
Normalized distances and anthropometric bounds:

D5n = 2.129     DA5n_min = 2.517    DA5n_max = 3.349
D10n = 0.919    DA10n_min = 1.031   DA10n_max = 1.515
Dew_ln = 0.677  Dew_rn = 0.452      DAewn_min = 0.840   DAewn_max = 1.077
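A minimal sketch of the final mask validation: a candidate is accepted only if its normalized distances fall inside the min/max bounds tabulated above. The bounds are the tabulated values; the pass/fail rule itself is an assumed reading of the slide.

```python
# Anthropometric bounds from the table above.
DA5N_MIN, DA5N_MAX = 2.517, 3.349
DA10N_MIN, DA10N_MAX = 1.031, 1.515
DAEWN_MIN, DAEWN_MAX = 0.840, 1.077

def validate_mask(da5, da7, da10):
    """da7 is the inner-eye-corner distance used for normalization."""
    da5n = da5 / da7                  # DA5n = DA5/DA7
    da10n = da10 / da7                # DA10n = DA10/DA7
    daewn = ((da5 - da7) / 2) / da7   # DAewn = ((DA5-DA7)/2)/DA7
    return (DA5N_MIN <= da5n <= DA5N_MAX and
            DA10N_MIN <= da10n <= DA10N_MAX and
            DAEWN_MIN <= daewn <= DAEWN_MAX)
```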
FAP (Facial Animation Parameters)
• Discrete features offer a neat, symbolic representation of expressions.
• Not constrained to a specific face model.
• Suitable for face cloning applications.
• MPEG-4 compatible: unified treatment of the analysis and synthesis parts in MMI environments.
FAPs estimation
AU Description
1 Inner brow raiser
2 Outer brow raiser
4 Brow lowerer
10 Upper lip raiser
12 Lip corner puller
15 Lip corner depressor
20 Lip stretcher
24 Lip presser
26 Jaw drop
Detectable action units with feature points
[Figure: face image with the detectable AUs (1, 2, 4, 10, 12, 15, 20, 24, 26) marked at their left/right feature-point locations]
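The deck estimates these AUs from feature-point measurements; the sketch below stands in with a simple per-AU displacement threshold. The threshold value and the input format are assumptions.

```python
# Detectable AUs from the table above.
AU_NAMES = {
    1: "Inner brow raiser", 2: "Outer brow raiser", 4: "Brow lowerer",
    10: "Upper lip raiser", 12: "Lip corner puller",
    15: "Lip corner depressor", 20: "Lip stretcher",
    24: "Lip presser", 26: "Jaw drop",
}

def detect_aus(displacements, threshold=0.02):
    """displacements: dict mapping AU id -> normalized FP displacement."""
    return {au: d for au, d in displacements.items()
            if au in AU_NAMES and abs(d) >= threshold}

active = detect_aus({1: 0.05, 4: 0.001, 26: 0.08})
for au, d in active.items():
    print(f"AU {au} ({AU_NAMES[au]}): displacement {d:+.3f}")
```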
Expression Profiles

Emotion     Original Definition   Adapted Definition
Fear        1+2+4+5+20+25         (1L+1R+2L+2R+20L+20R)/6
Surprise    1+2+5+26              (1L+1R+2L+2R+26+26)/6
Anger       4+5+7+24              (4L+4R+24L+24R)/4
Sadness     1+4+15                (4L+4R+15L+15R)/4
Disgust     4+9+10+17             (4L+4R+10)/3
Happiness   6+12+25               (12L+12R)/2
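A minimal sketch of scoring the adapted profiles: each emotion's score is the mean activation of its left/right AU terms, exactly as in the "Adapted Definition" column. That the activations are normalized strengths in [0, 1] is an assumption about the input format.

```python
# Adapted expression profiles from the table above.
PROFILES = {
    "fear":      ["1L", "1R", "2L", "2R", "20L", "20R"],
    "surprise":  ["1L", "1R", "2L", "2R", "26", "26"],
    "anger":     ["4L", "4R", "24L", "24R"],
    "sadness":   ["4L", "4R", "15L", "15R"],
    "disgust":   ["4L", "4R", "10"],
    "happiness": ["12L", "12R"],
}

def profile_scores(activations):
    """activations: dict mapping an AU term (e.g. '12L') -> strength."""
    return {emotion: sum(activations.get(t, 0.0) for t in terms) / len(terms)
            for emotion, terms in PROFILES.items()}

scores = profile_scores({"12L": 0.9, "12R": 0.8})
print(max(scores, key=scores.get))  # -> "happiness"
```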
Gesture Analysis
• Gestures are too ambiguous to indicate emotion on their own.
• Gestures are used to support the confidence of the facial expression analysis outcome.
Emotion    Gesture Class
Joy        hand clapping - high frequency
Sadness    hands over the head - posture
Anger      lift of the hand - high speed; italianate gestures
Fear       hands over the head - gesture; italianate gestures
Disgust    lift of the hand - low speed; hand clapping - low frequency
Surprise   hands over the head - gesture
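A minimal sketch of using gestures only as supporting evidence: the recognized gesture class boosts the facial-expression confidence of the matching emotions but never decides on its own. The boost factor and the input format are assumptions.

```python
# Gesture classes from the table above, mapped to the emotions they support.
GESTURE_TO_EMOTIONS = {
    "hand clapping - high frequency": ["joy"],
    "hands over the head - posture": ["sadness"],
    "lift of the hand - high speed": ["anger"],
    "italianate gestures": ["anger", "fear"],
    "hands over the head - gesture": ["fear", "surprise"],
    "lift of the hand - low speed": ["disgust"],
    "hand clapping - low frequency": ["disgust"],
}

def fuse(face_confidences, gesture, boost=1.2):
    """Boost (never replace) the facial confidences of matching emotions."""
    fused = dict(face_confidences)
    for emotion in GESTURE_TO_EMOTIONS.get(gesture, []):
        if emotion in fused:
            fused[emotion] = min(1.0, fused[emotion] * boost)
    return fused
```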
Emotion analysis system overview
f: value derived from the calculated distances
G: the value of the corresponding FAP
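The conclusion attributes this mapping to a fuzzy-rule architecture. As a sketch, a triangular membership function can turn a distance-derived value f into the activation G of the corresponding FAP; the breakpoints (a, b, c) are per-FAP parameters and are illustrative assumptions here.

```python
# Triangular fuzzy membership: maps a distance-derived value f
# to a FAP activation G in [0, 1].
def triangular_membership(f, a, b, c):
    """Return G: 0 outside [a, c], rising to 1 at the peak b."""
    if f <= a or f >= c:
        return 0.0
    if f <= b:
        return (f - a) / (b - a)
    return (c - f) / (c - b)

# Example: a FAP that activates as an eyebrow distance grows
# (breakpoints are hypothetical).
G = triangular_membership(f=0.45, a=0.2, b=0.5, c=0.8)
print(f"G = {G:.2f}")
```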
System Interface
[Figure: interface screenshot showing the calculated FP distances, the rules activated, and the recognized emotion]
Conclusion
• The system is divided into two main parts: feature detection and emotion interpretation.
• The user's emotional state is estimated using a fuzzy-rule architecture.
• The evaluation approach is based on anthropometric models and measurements.
Thank You!!