Automatic Facial Emotion Recognition Method Based on Eye Region Changes
Author(s):
Abstract:
Emotion is expressed through facial muscle movements, speech, body and hand gestures, and various biological signals such as the heartbeat. However, the most natural way that humans display emotion is through facial expression. Facial expression recognition has been a great challenge in the area of computer vision for the last two decades. This paper focuses on facial expression to identify seven universal human emotions, i.e. anger, disgust, fear, happiness, sadness, surprise, and neutral. Unlike the majority of other approaches, which use the whole face or regions of interest of the face, we restrict our facial emotion recognition (FER) method to analyzing human emotional states based on eye region changes. The reason for using this region is that the eye region is one of the most informative regions for representing facial expression. Furthermore, it leads to a lower feature dimension as well as lower computational complexity. The facial expressions are described by appearance features obtained from texture encoded with a Gabor filter, together with geometric features. A Support Vector Machine with RBF and polynomial kernel functions is used for classification of the different types of emotions. The Facial Expressions and Emotion Database (FG-NET), which contains spontaneous emotions, and the Cohn-Kanade (CK) database, which contains posed emotions, were used in the experiments. The proposed method was trained on the two databases separately and achieved accuracy rates of 96.63% for spontaneous emotion recognition and 96.6% for posed expression recognition, respectively.
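To illustrate the pipeline the abstract describes (Gabor-encoded appearance features from the eye region, concatenated with geometric features, classified by an SVM with RBF or polynomial kernels), the following is a minimal sketch using OpenCV and scikit-learn. The function names, Gabor filter-bank parameters, and feature layout are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the described pipeline; parameters are assumptions, not the paper's settings.
import cv2
import numpy as np
from sklearn.svm import SVC

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def gabor_features(eye_region, ksize=21, sigmas=(4.0,), thetas=np.arange(0, np.pi, np.pi / 4)):
    """Encode eye-region texture with a small Gabor filter bank (appearance features)."""
    feats = []
    for sigma in sigmas:
        for theta in thetas:
            kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta, 10.0, 0.5, 0)
            response = cv2.filter2D(eye_region, cv2.CV_32F, kernel)
            # Summarize each filter response by its mean and variance
            feats.extend([response.mean(), response.var()])
    return np.asarray(feats, dtype=np.float32)

def build_feature_vector(eye_region, landmarks):
    """Concatenate appearance (Gabor) features with simple geometric features,
    e.g. eye-corner and eyebrow landmark coordinates (hypothetical layout)."""
    appearance = gabor_features(eye_region)
    geometric = np.asarray(landmarks, dtype=np.float32).ravel()
    return np.concatenate([appearance, geometric])

def train_classifier(X, y, kernel="rbf"):
    """Train an SVM on feature vectors X with labels y (indices into EMOTIONS).
    kernel can be "rbf" or "poly", matching the two kernels mentioned in the abstract."""
    clf = SVC(kernel=kernel, C=10.0, gamma="scale")
    clf.fit(X, y)
    return clf
```

In use, each training image would contribute one feature vector from its cropped, grayscale eye region plus its landmarks; the trained classifier then predicts one of the seven emotion labels for a new eye region prepared the same way.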
Keywords:
Language:
English
Published:
Journal of Information Systems and Telecommunication, Volume:4 Issue: 4, Oct -Dec 2016
Page:
221
https://www.magiran.com/p1635659