
International Research Journal of Engineering and Technology (IRJET)
e-ISSN: 2395-0056 | p-ISSN: 2395-0072
Volume: 08 Issue: 08 | Aug 2021 | www.irjet.net

Emotion Recognition from Facial Expression Using Deep Learning

Deby Sarker1, Tania Sultana2, A.S. Tariq3

1,2B.Sc. Students, Dept. of Computer Science and Engineering, Bangladesh Army University of Engineering & Technology (BAUET), Natore-6431, Bangladesh
3Senior Lecturer, Dept. of Computer Science and Engineering, Bangladesh Army University of Engineering & Technology (BAUET), Natore-6431, Bangladesh

Abstract - Deep learning based emotion detection performs better than traditional image-processing methods. This paper discusses the procedure of emotion detection, which consists of three main steps: face detection, feature extraction, and emotion classification. It presents the design of an artificial intelligence (AI) system capable of recognizing emotion through facial expression. In the fully-connected layers of the deep Convolutional Neural Network (CNN), we employ a regularization method called “dropout” that proved very effective at reducing over-fitting. The system successfully classifies seven basic emotion classes; thus, the proposed method is shown to be effective for emotion recognition.


Key Words: Convolutional Neural Network; Deep Learning; Artificial Intelligence; Emotion Recognition.


1. INTRODUCTION

Human beings communicate with each other through speech, gestures, and emotions, so systems that can recognize these are in great demand in many fields. With respect to artificial intelligence, a computer will be able to interact with humans much more naturally if it is capable of understanding human emotion [1]. This would also help in counseling and other health-care related fields. Emotion is a mental state connected with the nervous system and linked with feelings, perceptions, behavioral reactions, and a degree of pleasure or irritation. One of the present applications of artificial intelligence (AI) using neural networks is the recognition of faces in images [2]. Most techniques process visual data and search for patterns common to human faces. Face detection can be used for surveillance purposes by law enforcement as well as in crowd administration [3]. In this paper, we present a method for identifying seven emotions, namely neutral, anger, sadness, disgust, fear, happiness, and surprise, from facial images. Previous work used deep-learning technology to create emotion-based models of facial expressions in order to identify emotions.

2. FACIAL EXPRESSION RECOGNITION

Identifying or verifying a person's expression is the main task of facial expression recognition, which combines several important steps. Figure 1 demonstrates the basic process of facial expression recognition.

Fig. 1: Facial Expression Recognition System Flowchart.

After a face has been detected in a given image, the next step is feature extraction, which isolates the important facial areas. Two methods can be used for feature extraction:

• Analytic approach.
• Holistic approach.

The holistic approach uses the raw facial image as input, whereas the analytic approach detects and extracts a set of important facial features. The analytic approach is used in this paper: selected features obtained through edge detection are sent as input to a classifier. The holistic approach takes the global properties of the observed patterns into consideration, while the analytic approach used here computes a set of geometric features that form the feature vectors.
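To make the pipeline concrete, the following Python sketch illustrates the three steps above: face detection with an OpenCV Haar cascade, analytic feature extraction via an edge map, and classification with a small CNN that uses dropout in its fully-connected layers, as mentioned in the abstract. The libraries, the 48x48 input size, the network depth, and all hyper-parameters are illustrative assumptions rather than the exact configuration used in this work.

```python
# Illustrative sketch only: face detection -> edge-based feature extraction
# -> CNN classification with dropout. Library choices and hyper-parameters
# are assumptions, not the paper's exact configuration.
import cv2
import numpy as np
from tensorflow.keras import layers, models

EMOTIONS = ["neutral", "anger", "sadness", "disgust", "fear", "happiness", "surprise"]

def extract_face_features(image_path, size=48):
    """Detect the largest face in an image and return its edge map as CNN input."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest detection
    face = cv2.resize(gray[y:y + h, x:x + w], (size, size))
    edges = cv2.Canny(face, 100, 200)                    # analytic, edge-based features
    return edges.astype("float32")[..., np.newaxis] / 255.0

def build_cnn(num_classes=len(EMOTIONS), size=48):
    """Small CNN; dropout in the fully-connected part reduces over-fitting."""
    inputs = layers.Input(shape=(size, size, 1))
    x = layers.Conv2D(32, 3, activation="relu")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    x = layers.Dense(128, activation="relu")(x)
    x = layers.Dropout(0.5)(x)               # "dropout" regularization in the FC layers
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

In practice, extract_face_features would be applied to every training image and the resulting arrays stacked into batches before fitting the model on the labeled data.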

3. BACKGROUND STUDY

3.1 Related Work

The following paragraphs introduce approaches based on feature engineering. Although their feature-extraction and classification methods differ, all of them used the Cohn-Kanade dataset as part of their work. It is worth mentioning that the Cohn-Kanade dataset was also used in this research, so the results offer a valuable comparison.

Kotsia et al. focused on the effect of occlusion when classifying six facial emotion expressions. To achieve this, several feature-engineering techniques and classification models were combined. The feature-extraction techniques were Gabor features, a linear filter used for edge detection, and Discriminant Non-negative Matrix Factorization (DNMF), which exploits the non-negativity of the data being handled. A multiclass support vector machine (SVM) and a multi-layer perceptron (MLP) were used to classify these features. The results over Cohn-Kanade are the following: with an MLP, 91.6% using Gabor features and 86.7% using DNMF, while with the SVM it was 91.4%. Another corpus used was JAFFE:
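As a point of reference for the feature-engineering baselines above, the sketch below pairs a small Gabor filter bank with a multiclass SVM using OpenCV and scikit-learn. The filter-bank parameters, the pooled statistics used as features, and the RBF kernel are assumptions chosen for illustration; they do not reproduce the exact setup reported by Kotsia et al.

```python
# Hedged sketch of a Gabor-features + multiclass SVM baseline.
# Parameters are illustrative, not those of Kotsia et al.
import cv2
import numpy as np
from sklearn.svm import SVC

def gabor_features(gray_face, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Filter a grayscale face crop with a small Gabor bank and pool simple statistics."""
    feats = []
    for theta in thetas:
        kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        response = cv2.filter2D(gray_face.astype("float32"), cv2.CV_32F, kernel)
        feats.extend([response.mean(), response.var()])  # mean/variance per orientation
    return np.array(feats)

def train_svm(face_crops, labels):
    """face_crops: grayscale face images; labels: integer emotion classes (e.g. from Cohn-Kanade)."""
    X = np.stack([gabor_features(face) for face in face_crops])
    clf = SVC(kernel="rbf", C=1.0)       # one-vs-one multiclass handling by default
    clf.fit(X, labels)
    return clf
```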


© 2021, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal

