Please use this identifier to cite or link to this item:
http://hdl.handle.net/10603/7529
Title: MULTIMODAL EMOTION RECOGNITION FOR ENHANCING HUMAN COMPUTER INTERACTION
Researcher: Khanna Preeti
Guide(s): Dr. Sasikumar M.
Keywords: Computer; Emotion Recognition
Upload Date: 13-Mar-2013
University: Narsee Monjee Institute of Management Studies
Completed Date: 12/12/2012
Abstract: Emotions are fundamental to human lives and play an important role in communication between people. This interactional phenomenon has been extensively studied by several research groups worldwide. Enabling computers to read emotions is motivated by the potentially wide range of applications involving human-computer interfaces. Such affective computing systems are an active area of research, which includes recognizing emotions as well as generating appropriate responses. This dissertation addresses the emerging area of research on emotions and human-computer interaction.

We present approaches for recognizing emotions from three modalities: static images of the face, speech, and keyboard stroke patterns. Each of these modalities was first investigated separately as a unimodal system. Recognizing the inadequacy of unimodal approaches, the study also explores a multimodal emotion recognition framework that combines these modalities. The work involves identifying and extracting relevant features from each modality and using various classification algorithms to classify them into emotions.

The features extracted from the face include the positions of the eyes, eyebrows, mouth, and nose, and various distances involving them. Features of interest in speech are pitch, formant frequencies, parameters related to voiced and unvoiced regions, and MFCCs. The features extracted from keyboard interaction include typing speed, number of backspaces used, number of errors made, etc. Each feature was analyzed individually to observe its variation across emotions. We used classification algorithms such as neural networks and Bayesian classification to classify the emotions.

Various studies involving different subsets of features and different classification algorithms were carried out to identify relevant features and a classification framework. Performance was also evaluated for gender dependency across emotions for different subsets of features and classification algorithms.
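The abstract describes a pipeline of per-modality feature extraction, feature-level fusion, and classification with methods such as Bayesian classifiers. As a toy illustration only, the sketch below concatenates hypothetical face, speech, and keystroke feature vectors and classifies them with a hand-rolled Gaussian naive Bayes; the feature values, modality names, and classifier here are illustrative assumptions, not the thesis's actual features or implementation.

```python
import math

# Hypothetical per-sample modality features (assumed for illustration):
# face geometry distances, speech pitch, and keystroke statistics.
def fuse(sample):
    """Feature-level fusion: concatenate modality vectors into one vector."""
    return sample["face"] + sample["speech"] + sample["keyboard"]

class GaussianNB:
    """Minimal Gaussian naive Bayes, one instance of the 'Bayesian
    classification' family the abstract mentions."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.stats = {}
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            n = len(rows)
            means = [sum(col) / n for col in zip(*rows)]
            # Small floor on the variance avoids division by zero.
            vars_ = [sum((v - m) ** 2 for v in col) / n + 1e-6
                     for col, m in zip(zip(*rows), means)]
            self.stats[c] = (math.log(n / len(X)), means, vars_)
        return self

    def predict(self, x):
        def log_posterior(c):
            prior, means, vars_ = self.stats[c]
            return prior + sum(
                -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                for xi, m, v in zip(x, means, vars_))
        return max(self.classes, key=log_posterior)

# Toy training data for two emotion classes (values are made up).
train = [
    ({"face": [0.1, 0.2], "speech": [120.0], "keyboard": [3.0]}, "happy"),
    ({"face": [0.2, 0.1], "speech": [130.0], "keyboard": [2.0]}, "happy"),
    ({"face": [0.8, 0.9], "speech": [90.0],  "keyboard": [9.0]}, "sad"),
    ({"face": [0.9, 0.8], "speech": [85.0],  "keyboard": [8.0]}, "sad"),
]
X = [fuse(s) for s, _ in train]
y = [label for _, label in train]
clf = GaussianNB().fit(X, y)
pred = clf.predict(fuse({"face": [0.15, 0.15], "speech": [125.0],
                         "keyboard": [2.5]}))  # near the "happy" centroid
```

This sketches only feature-level (early) fusion; decision-level fusion, where each unimodal classifier votes and the votes are combined, is the other common strategy for such multimodal frameworks.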
Pagination: 233
URI: http://hdl.handle.net/10603/7529
Appears in Departments: Department of Computer Engineering
Files in This Item:
File | Size | Format
---|---|---
01_cover pg.pdf | 120.03 kB | Adobe PDF
02_table of content.pdf | 116.35 kB | Adobe PDF
03_chapter 1.pdf | 148.28 kB | Adobe PDF
04_chapter 2.pdf | 274.84 kB | Adobe PDF
05_chapter 3.pdf | 1.67 MB | Adobe PDF
06_chapter 4.pdf | 1.56 MB | Adobe PDF
07_chapter 5.pdf | 622.18 kB | Adobe PDF
08_chapter 6.pdf | 1.16 MB | Adobe PDF
09_chapter 7.pdf | 319.95 kB | Adobe PDF
10_chapter 8.pdf | 98.36 kB | Adobe PDF
11_chapter 9.pdf | 94.86 kB | Adobe PDF
12_chapter 10 and 11.pdf | 133.3 kB | Adobe PDF
Items in Shodhganga are licensed under the Creative Commons Licence Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0).