Human Emotion Recognition System


PRESENTED BY

SOUMI SARKAR

B.TECH CSE

ROLL NO.-91/CSE/131036

HUMAN EMOTION RECOGNITION SYSTEM

INTRODUCTION

Human emotion detection is a challenging field that targets methods for effective human-computer interaction. Emotions play a major role in human life: at different moments, the face reflects how a person feels and what mood he or she is in.

Here we use an existing simulator that is able to capture human emotions by reading and comparing facial expressions.

HUMAN EMOTIONS

• Emotion is often intertwined with mood, temperament, personality, disposition, and motivation. Human emotions help us cope with everyday life, allowing us to communicate what we feel toward certain situations, people, things, thoughts, senses, dreams, and memories.

• Many psychologists believe that there are six main types of emotions, also called basic emotions: happiness, anger, fear, sadness, disgust, and surprise.

OBJECTIVE

• To analyze the limitations of the existing system "Emotion recognition using brain activity".

• To pre-process and resize the image.

• To detect the edges and reduce the image size.

• To extract the features of the face.

• To find the difference between the input image and the certified images stored in the knowledge base.

• To recognize emotions based on the distances between various feature points.

DESCRIPTION OF THE TECHNIQUE

There are various techniques used in emotion recognition systems. Typically, an automated facial expression recognition system includes a camera for capturing the facial image. The image is then pre-processed to minimize environmental and other variations; this includes image scaling and brightness adjustment. After that, the face, mouth, and eye regions are detected, i.e. feature extraction. Then, with the help of the eye and lip features, we classify five different emotions.

HOW THE EMOTION RECOGNITION SYSTEM WORKS:

[Flowchart: Input Image → Image Processing and Resize → Edge Detection → Face Detection → Feature Extraction → Distance Measurement → Emotion Recognition]

Knowledge Base

It contains certified images which we use for comparison for the sake of emotion recognition. These images are of high quality and are stored in the given database.

Pre-processing and Resize

The image pre-processing procedure is a very important step in the facial expression recognition task. The aim of the pre-processing phase is to obtain images that have normalized intensity and uniform size and shape.
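The slides do not include the pre-processing code itself; the following is a minimal Python/OpenCV sketch of the normalization described (grayscale conversion, resizing to a uniform shape, and histogram equalization for intensity), with the file path and target size as placeholders.

import cv2

def preprocess(path, size=(128, 128)):
    # Load the image, drop colour, resize to a uniform shape,
    # and normalize intensity via histogram equalization.
    img = cv2.imread(path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, size)
    return cv2.equalizeHist(gray)

# Example (placeholder path): face = preprocess("test_face.jpg")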

Color Space Transformation and Lighting Compensation

In order to work in a real-time system, we adopt skin-color detection as the first step of face detection and use a color-space transform to detect human skin. However, the luminance of every image is different, so every image has a different color distribution.
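The slide does not name the transform; a YCrCb colour-space transform with a fixed Cr/Cb skin range is one common choice, and the sketch below (with assumed threshold values) only illustrates the idea.

import cv2
import numpy as np

def skin_mask(bgr_img):
    # Binary mask of likely skin pixels in the YCrCb colour space.
    ycrcb = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)     # Y, Cr, Cb lower bounds (assumed)
    upper = np.array([255, 173, 127], dtype=np.uint8)  # Y, Cr, Cb upper bounds (assumed)
    return cv2.inRange(ycrcb, lower, upper)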

High-Frequency Noise Removal

The main goal of this step is to enhance the input image and remove various types of noise using a noise-removal algorithm.
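The slides do not specify which noise-removal algorithm is used; a median filter is one common way to suppress high-frequency noise, sketched here with an assumed kernel size.

import cv2

def remove_noise(gray):
    # 3x3 median filter suppresses salt-and-pepper / high-frequency noise.
    return cv2.medianBlur(gray, 3)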

Edge Detection

Edges are detected using commands from the Image Processing Toolbox in MATLAB.
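As a rough Python/OpenCV equivalent of those MATLAB toolbox commands, one could use the Canny operator (the thresholds below are assumed values):

import cv2

def detect_edges(gray):
    # Binary edge map from a grayscale image using the Canny detector.
    return cv2.Canny(gray, 100, 200)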

Size Reduction

A technique now commonly used for dimensionality reduction in computer vision, particularly in face recognition, is principal component analysis (PCA).
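A minimal NumPy sketch of PCA-based reduction, assuming the face images have already been flattened into the rows of a matrix X:

import numpy as np

def pca_reduce(X, k):
    # Project row-vector samples X (n_samples x n_pixels) onto the
    # top-k principal components.
    mean = X.mean(axis=0)
    Xc = X - mean                                   # centre the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                             # top-k principal axes
    return Xc @ components.T, mean, components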

[Figures: results of the pre-processing step, results after noise removal, and results of edge detection]

FACE FEATURE EXTRACTION

Eye Detection: One common method is to extract the shape of the eyes, nose, mouth, and chin, and then distinguish faces by the distance and scale of those organs.

Feature 1: width of left eye
Feature 2: width of right eye
Feature 3: width of nose
Feature 4: width of mouth corners
Feature 5: width of face
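A small sketch of assembling this five-element feature vector, assuming facial landmark coordinates (the (x, y) points and their names below are hypothetical) have already been located:

import numpy as np

def width(p_left, p_right):
    # Euclidean distance between two landmark points (x, y).
    return float(np.hypot(p_right[0] - p_left[0], p_right[1] - p_left[1]))

def feature_vector(lm):
    # lm is a dict of hypothetical landmark points.
    return np.array([
        width(lm["left_eye_outer"],  lm["left_eye_inner"]),   # Feature 1
        width(lm["right_eye_inner"], lm["right_eye_outer"]),  # Feature 2
        width(lm["nose_left"],       lm["nose_right"]),       # Feature 3
        width(lm["mouth_left"],      lm["mouth_right"]),      # Feature 4
        width(lm["face_left"],       lm["face_right"]),       # Feature 5
    ])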

FACE DETECTION

Face localization aims to determine the image position of a single face; this is a simplified detection problem under the assumption that an input image contains only one face.
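The slides do not say which detector is used; OpenCV's bundled Haar cascade is one off-the-shelf option, sketched below under the single-face assumption (only the largest detection is kept):

import cv2

# Frontal-face Haar cascade shipped with opencv-python (one possible choice)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def localize_face(gray):
    # Bounding box (x, y, w, h) of the largest detected face, or None.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda box: box[2] * box[3])  # largest area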

DISTANCE MEASUREMENT

If the features are n-dimensional, the generalized Euclidean distance formula is used to measure the distance.
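For reference, that formula for two n-dimensional feature vectors p and q is (in LaTeX notation):

d(\mathbf{p}, \mathbf{q}) = \sqrt{\sum_{i=1}^{n} (p_i - q_i)^2}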

Emotion Recognition

Detection of emotions is based on the calculation of distances between various feature points. In this step, the distances of the testing image are compared with those of the neutral image, and the best possible match for the testing image is selected from the train folder.
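A minimal sketch of this comparison, assuming feature vectors have already been extracted for the test image, the neutral image, and the trained images (the data layout is illustrative, not taken from the slides):

import numpy as np

def recognize(test_vec, neutral_vec, train_vecs):
    # train_vecs: dict mapping an emotion label to its stored feature vector.
    # Returns the distance from the neutral image and the best-matching label.
    dist_to_neutral = float(np.linalg.norm(test_vec - neutral_vec))
    best = min(train_vecs, key=lambda label: np.linalg.norm(test_vec - train_vecs[label]))
    return dist_to_neutral, best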

ALGORITHM

Step 1: Input an image f(x, y).

Step 2: Apply enhancement and restoration so the face can be detected accurately, giving g(x, y).

Step 3: Compare g(x, y) with the emotion categories. Each category i holds several images of the same kind of emotion (e.g. happy, very happy), stored in an array emotion[i][j], where i is the emotion section and j indexes the images in that section.

    for (i = 1; i <= sections; i++)
        for (j = 1; j <= total images in section i; j++)
            if (g(x, y) resembles emotion[i][j])
                Result[i] = percentage match;   % how closely g(x, y) resembles emotion[i][j]
                break;

Step 4: for (i = 1; i <= sections; i++)
            print section i "has" Result[i];

Step 5: Exit.
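A hedged Python sketch of the same idea, assuming the knowledge base is a dict mapping each emotion category to a list of stored feature vectors and that closeness is measured with the Euclidean distance above; the names and the percentage-style scoring rule are assumptions, not taken from the slides.

import numpy as np

def score_categories(g_vec, emotion_db):
    # emotion_db: {category: [feature vectors of certified images]}.
    # Returns a percentage-style score per category (higher = closer).
    scores = {}
    for category, vectors in emotion_db.items():
        best_dist = min(np.linalg.norm(g_vec - v) for v in vectors)  # closest certified image
        scores[category] = 100.0 / (1.0 + best_dist)                 # assumed scoring rule
    return scores

# for category, score in score_categories(test_vec, emotion_db).items():
#     print(f"{category}: {score:.2f}%")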

PERFORMANCE ANALYSIS

The experimental results show that our algorithm can identify 30 emotions in our test image. Besides identifying emotions, the algorithm also reports the distance of the test image from the neutral image and the best match for the test image among the trained images. Our proposed algorithm is therefore suitable for use in real-time systems with high performance. After implementing the algorithm in the Facial Expression Recognition simulator, I have compared its output with the results of other methods.

Input image output:

anger = 00.00%
happy = 90.00%
sad = 04.50%
surprised = 03.30%
neutral = 02.20%

CONCLUSION

In this technique we have analyzed the limitations of the existing system "Emotion recognition using brain activity". That system measures brain activity with electroencephalography (EEG) signals, which is a difficult task: measuring the human brain with EEG is expensive, complex, and time-consuming. Even when existing data was used, its analysis results were only 31 to 81 percent correct, and even with fuzzy logic only 72 to 81 percent accuracy was achieved, and only for two classes of emotions. Our technique shows that human emotions can also be sensed by reading faces and comparing them with the images stored in the knowledge base. Using a system trained with neural networks, we have achieved up to 97 percent accurate results.

FUTURE WORK

We have obtained good results using a simulator based on a neural network. However, this simulator still has limitations: at a time it gives a single yes/no style answer, even though one input may belong to several emotion classes rather than only one. As future work, we will try to add fuzzy-logic membership functions to it.

REFERENCES

"A Method for Face Recognition from Facial Expression" by Sarbani Ghosh and Samir K. Bandyopadhyay.

I.J. Image, Graphics and Signal Processing, 2012, 8, 50-56. Published online August 2012 in MECS (http://www.mecs-press.org/). DOI: 10.5815/ijigsp.2012.08.07.

Neha Gupta and Prof. Navneet Kaur, International Journal of Engineering Research and Applications (IJERA), ISSN: 2248-9622, www.ijera.com, Vol. 3, Issue 4, Jul-Aug 2013, pp. 2002-2006.

https://en.wikipedia.org/wiki/Emotion_Recognition

Thank You
