Please use this identifier to cite or link to this item: http://hdl.handle.net/10603/7529
Title: MULTIMODAL EMOTION RECOGNITION FOR ENHANCING HUMAN COMPUTER INTERACTION
Researcher: Khanna Preeti
Guide(s): Dr. Sasikumar M.
Keywords: Computer; Emotion Recognition
Upload Date: 13-Mar-2013
University: Narsee Monjee Institute of Management Studies
Completed Date: 12/12/2012
Abstract: Emotions are fundamental to human lives and play an important role in communication between people. This interactional phenomenon has been studied extensively by several research groups worldwide. Enabling computers to read emotions is motivated by the potentially wide range of applications involving human-computer interfaces. Such affective computing systems are an active area of research, which includes recognizing emotions as well as generating appropriate responses. This dissertation addresses the emerging area of research on emotions and human-computer interaction.

We present approaches for recognizing emotions from static face images, speech, and keyboard stroke patterns. Each of these modalities has been investigated separately as a unimodal system. Recognizing the inadequacy of unimodal approaches, the study also explores a multimodal emotion recognition framework that combines these modalities. The work involves identifying and extracting relevant features from each modality and using various classification algorithms to classify them into emotions.

The features extracted from the face include the positions of the eyes, eyebrows, mouth, and nose and various distances involving them. Features of interest in speech are pitch, formant frequencies, parameters related to voiced and unvoiced regions, and MFCCs. The features extracted from keyboard interaction are typing speed, number of backspaces used, number of errors made, etc. Each of these features was analyzed individually to see its variation across emotions. We have used classification algorithms such as neural networks and Bayesian classification to classify the emotions. Various studies involving different subsets of features and different classification algorithms were carried out to identify relevant features and a classification framework. The performances were also evaluated for gender dependency across emotions for different subsets of features and classification algorithms.
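The abstract describes extracting unimodal features (facial geometry, speech pitch/formants/MFCCs, keystroke statistics) and classifying them with neural networks or Bayesian classifiers. The following is a minimal, hypothetical sketch of such a pipeline in Python; it uses librosa and scikit-learn, which the thesis does not name, and the feature choices, fusion scheme, emotion labels, and toy data are illustrative assumptions rather than the author's actual method.

```python
# Illustrative sketch only; not the thesis implementation.
import numpy as np
import librosa                        # MFCC-style speech features (assumed library)
from sklearn.naive_bayes import GaussianNB   # simple Bayesian classifier (assumed library)

def speech_features(y, sr):
    """Summarise one utterance by the mean and std of 13 MFCCs over time."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def keyboard_features(typing_speed_cpm, backspaces, errors):
    """Keystroke-interaction features: typing speed, corrections, errors."""
    return np.array([typing_speed_cpm, backspaces, errors], dtype=float)

def fuse(speech_vec, keyboard_vec):
    """Feature-level fusion: concatenate the unimodal feature vectors."""
    return np.concatenate([speech_vec, keyboard_vec])

if __name__ == "__main__":
    sr = 16000
    t = np.linspace(0, 1.0, sr, endpoint=False)
    # Two toy 'utterances' (sine tones at different pitches) standing in
    # for recordings made under different emotional states.
    calm = np.sin(2 * np.pi * 180 * t).astype(np.float32)
    excited = np.sin(2 * np.pi * 320 * t).astype(np.float32)

    X = np.vstack([
        fuse(speech_features(calm, sr),    keyboard_features(220, 2, 1)),
        fuse(speech_features(excited, sr), keyboard_features(340, 9, 6)),
    ])
    y = np.array(["neutral", "angry"])   # hypothetical emotion labels

    clf = GaussianNB().fit(X, y)         # Bayesian classification, as mentioned in the abstract
    print(clf.predict(X))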
Pagination: 233
URI: http://hdl.handle.net/10603/7529
Appears in Departments:Department of Computer Engineering

Files in This Item:
File                         Size        Format
01_cover pg.pdf              120.03 kB   Adobe PDF
02_table of content.pdf      116.35 kB   Adobe PDF
03_chapter 1.pdf             148.28 kB   Adobe PDF
04_chapter 2.pdf             274.84 kB   Adobe PDF
05_chapter 3.pdf             1.67 MB     Adobe PDF
06_chapter 4.pdf             1.56 MB     Adobe PDF
07_chapter 5.pdf             622.18 kB   Adobe PDF
08_chapter 6.pdf             1.16 MB     Adobe PDF
09_chapter 7.pdf             319.95 kB   Adobe PDF
10_chapter 8.pdf             98.36 kB    Adobe PDF
11_chapter 9.pdf             94.86 kB    Adobe PDF
12_chapter 10 and 11.pdf     133.3 kB    Adobe PDF


Items in Shodhganga are licensed under Creative Commons Licence Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0).
