International Research Journal of Engineering and Technology (IRJET)
e-ISSN: 2395-0056
Volume: 05 Issue: 03 | Mar-2018
p-ISSN: 2395-0072
www.irjet.net
Review on Mood Detection using Image Processing and Chatbot using Artificial Intelligence

Prof. D. S. Thosar1, Varsha Gothe2, Priyanka Bhorkade3, Varsha Sanap4
1Asst. Professor, PRES's SVIT, Nasik, Maharashtra, India
2,3,4Student, PRES's SVIT, Nasik, Maharashtra, India
---------------------------------------------------------------------***---------------------------------------------------------------------

Abstract - Human gestures play a very interesting role in everyday applications and can easily be recognized using image processing. Consider the gesture of a driver currently driving a vehicle: recognizing it is quite useful for alerting him when he is in a sleepy mood. We can identify a human gesture by observing the movements of the eyes, nose, brows and cheeks, which vary with time. The proposed system recognizes expressions by focusing on the human face. The implementation is based on two approaches: a face-detection classifier, and the finding and matching of simple tokens. We have also adopted one more approach, a chatbot built using artificial intelligence. Through the chatting application the user chats with the bot in such a way that he does not realize he is actually talking to a bot; the system obtains the user's response in the form of text, speech or gestures and identifies the user's mood using text processing and data mining. Considering both approaches, the system is able to provide jokes, songs and links to web pages by recognizing the user's response.
Key Words: Artificial Intelligence, Chatbot, Data Mining, Image Processing, Text Processing
1. INTRODUCTION

Facial expression recognition depends on the mood-detection process. It is a research problem that spans several fields and disciplines. Facial expression recognition has numerous practical applications, such as security monitoring, access control and surveillance systems. The behaviors of the human face serve various functions, including speech illustration, emblematic gestures and others. In speech illustration, an inquisitive human often raises the brows and then lowers them while lowering the voice. A doubtful look is produced in the emblematic-gesture phase, where the human raises the upper lip by pushing the lower lip up. Lines appearing across the forehead often go with a stressed mood, and changes in the eyes let the system recognize tiredness in the user. The application first captures a facial image via the web camera, and mood detection is then performed. Inherent emotional meaning is nothing but the mood of the person. Human dialogue may also play a significant role in mood detection. This approach basically relies on an artificial intelligence algorithm with which the chatbot is built.

Our application will not only detect the user's mood but also provide relevant data from the database to boost it. For example, if the user is in a sad mood, the system will automatically fetch songs or jokes from the database and send them to the user's window terminal. The system will also be able to provide links to web pages with motivational speeches. The data provided by the system boosts the user's mood, which helps the user work efficiently and leads to enhanced performance.

2. LITERATURE SURVEY

The mood of music describes the intrinsic emotional meaning of a musical clip. It is useful for musical understanding, musical research and some music-related applications. This work presents a hierarchical framework to automate the task of detecting mood from acoustic musical data, following psychological theories of music in Western cultures. Three sets of features, namely intensity, timbre and rhythm, are extracted to represent the characteristics of a music clip. A mood-tracking approach for an entire piece of music is also presented. Experimental evaluations indicate that the proposed algorithms produce satisfactory results. [1]
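The chat side of the system, detecting the user's mood from a message and fetching mood-boosting content, can be sketched as a naive keyword lookup. This is our own illustration, not the paper's algorithm: MOOD_KEYWORDS, RESPONSES and the matching rule are all hypothetical stand-ins for the text-processing step and the content database.

```python
# Hypothetical end-to-end sketch of the chat side: detect mood from the
# user's message via keyword matching, then fetch mood-boosting content.

MOOD_KEYWORDS = {  # illustrative cue words, not the paper's data
    "sad": {"sad", "unhappy", "depressed", "bad day"},
    "happy": {"happy", "great", "awesome"},
}

RESPONSES = {  # stand-in for the songs/jokes/links database
    "sad": ["joke from database", "cheerful song", "link: motivational speech"],
}

def text_mood(message):
    """Return the first mood whose cue appears in the message."""
    msg = message.lower()
    for mood, cues in MOOD_KEYWORDS.items():
        if any(cue in msg for cue in cues):
            return mood
    return "neutral"

def boost(message):
    """Detected mood plus any content sent to the user's terminal."""
    mood = text_mood(message)
    return mood, RESPONSES.get(mood, [])

print(boost("I had a bad day at work"))
# -> ('sad', ['joke from database', 'cheerful song', 'link: motivational speech'])
```

A real system would replace the keyword table with proper text processing and data mining, as the paper suggests, but the control flow (classify, then look up content) would be the same.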
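The three feature families extracted in [1] (intensity, timbre and rhythm) can be illustrated with crude signal-processing proxies. The sketch below is our own illustration on a synthetic clip, not the cited authors' implementation: RMS energy stands in for intensity, the spectral centroid for timbre, and the dominant periodicity of the energy envelope for rhythm.

```python
import numpy as np

def music_mood_features(y, sr):
    """Crude stand-ins for the three feature families surveyed in [1]."""
    # Intensity: root-mean-square energy of the waveform.
    intensity = float(np.sqrt(np.mean(y ** 2)))
    # Timbre: spectral centroid ("brightness") of the magnitude spectrum.
    mag = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)
    timbre_hz = float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))
    # Rhythm: dominant periodicity of the energy envelope, found by
    # autocorrelation and converted to beats per minute.
    env = np.abs(y)
    env = env - env.mean()
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]  # ac[k] = lag k
    min_lag = sr // 10  # ignore periods shorter than 100 ms
    lag = int(np.argmax(ac[min_lag:])) + min_lag
    tempo_bpm = 60.0 * sr / lag
    return {"intensity": intensity, "timbre_hz": timbre_hz, "tempo_bpm": tempo_bpm}

# Synthetic 2-second clip: a 440 Hz tone pulsed on and off twice per
# second, i.e. 120 beats per minute.
sr = 4000
t = np.arange(2 * sr) / sr
y = np.sin(2 * np.pi * 440 * t) * (np.sin(2 * np.pi * 2 * t) > 0)

feats = music_mood_features(y, sr)
print(feats)
```

On this clip the proxies recover roughly what one would expect: a centroid near the 440 Hz carrier and a tempo near 120 BPM. The hierarchical classifier and Western-music mood taxonomy of [1] sit on top of features like these.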
The human face plays a prodigious role in the automatic recognition of emotions, within the field of human emotion identification and human-computer interaction, for real applications such as driver status monitoring, personalized learning and health monitoring. However, the features used are not considered dynamic characteristics independent of the subject, so they are not robust enough for real-life recognition under variation of the subject (the human face), head movement and changes in illumination. In this article, the authors attempt to design an automated framework for detecting emotions using facial expression. For human-computer interaction, facial expression is a platform for non-verbal communication. Emotions are changing events that are evoked as a result of a driving force. Thus, in the application of real
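An automated facial-expression framework like the one discussed above ultimately needs a mapping from facial measurements to emotion labels. A minimal rule-based sketch follows; the feature names and thresholds are our hypothetical illustrations (a real framework would derive them from a face-detection classifier's landmarks and learned models, not fixed rules).

```python
def classify_mood(eye_openness, brow_raise, mouth_curve):
    """Rule-based mood guess from normalized facial measurements.

    All thresholds are illustrative, not taken from the surveyed work.
    eye_openness: 0.0 (closed) .. 1.0 (wide open)
    brow_raise:   0.0 (lowered) .. 1.0 (fully raised)
    mouth_curve: -1.0 (corners down) .. +1.0 (corners up)
    """
    if eye_openness < 0.3:
        return "tired"       # drooping eyes, e.g. a drowsy driver
    if brow_raise > 0.7:
        return "surprised"   # raised brows, inquisitive look
    if mouth_curve < -0.3:
        return "sad"         # downturned mouth corners
    if mouth_curve > 0.3:
        return "happy"
    return "neutral"

print(classify_mood(0.2, 0.5, 0.0))   # -> tired
print(classify_mood(0.9, 0.8, 0.0))   # -> surprised
print(classify_mood(0.8, 0.4, -0.6))  # -> sad
```

The fragility the paragraph above points out is visible even here: fixed thresholds on raw measurements break under subject variation, head movement and lighting changes, which is why subject-independent dynamic features are sought.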