Title: Brainwave-Based Human Emotion Estimation using Deep Neural Network Models for Biofeedback
Authors: Liu, Jingxin
Advisors: Meng, H
Nandi, A
Keywords: EEG; Emotion recognition
Issue Date: 2019
Publisher: Brunel University London
Abstract: Emotion is a state that comprehensively reflects human feeling, thought and behavior, and thus plays an important role in interpersonal communication. Emotion estimation aims to automatically discriminate different emotional states using physiological and non-physiological signals acquired from humans, in order to achieve effective communication and interaction between humans and machines. Brainwave-based emotion estimation is one of the most commonly used and efficient approaches in emotion estimation research, with applications in treating emotional disorders, brain-computer interfaces for people with disabilities, entertainment and many other areas. In this thesis, various methods, schemes and frameworks are presented for electroencephalogram (EEG) based human emotion estimation. Firstly, a hybrid-dimension feature reduction scheme is presented using a total of 14 different features extracted from EEG recordings. The scheme combines these distinct features in the feature space using both supervised and unsupervised feature selection processes. Maximum Relevance Minimum Redundancy (mRMR) is applied to re-order the combined features for maximal relevance to the emotion labels and minimal redundancy among the features. The resulting features are further reduced with Principal Component Analysis (PCA) to extract the principal components. Experimental results show that the proposed work outperforms state-of-the-art methods under the same settings on the publicly available Database for Emotion Analysis using Physiological Signals (DEAP) dataset. Secondly, a disentangled adaptive-noise-learning β-variational autoencoder (β-VAE) combined with a long short-term memory (LSTM) model is proposed for emotion recognition based on EEG recordings. This experiment also uses the public DEAP EEG emotion dataset.
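The mRMR-then-PCA pipeline described above can be sketched roughly as follows. This is a minimal illustrative assumption, not the thesis implementation: the greedy mRMR ranking uses mutual information as the relevance proxy and mean absolute correlation as the redundancy proxy, on synthetic data standing in for the 14 EEG features.

```python
# Hypothetical sketch: mRMR re-ordering of combined EEG features, then PCA.
# The relevance/redundancy proxies and all data here are illustrative.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.decomposition import PCA

def mrmr_order(X, y, n_select):
    """Greedy mRMR: pick features maximising (relevance - redundancy)."""
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    remaining = [i for i in range(X.shape[1]) if i not in selected]
    while len(selected) < n_select and remaining:
        scores = []
        for i in remaining:
            # redundancy proxy: mean |correlation| with already-chosen features
            red = np.mean([abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
                           for j in selected])
            scores.append(relevance[i] - red)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 14))        # 200 epochs x 14 extracted features
y = rng.integers(0, 2, size=200)      # binary emotion labels
order = mrmr_order(X, y, n_select=10) # mRMR-ranked feature subset
Z = PCA(n_components=5).fit_transform(X[:, order])  # principal components
print(Z.shape)                        # (200, 5)
```

In this two-stage design, mRMR discards label-irrelevant or mutually redundant features before PCA compresses what remains into a small set of orthogonal components.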
First, the EEG time-series data are transformed into video-like EEG image data by applying the Azimuthal Equidistant Projection (AEP) to the original 3-D EEG-sensor coordinates, yielding 2-D projected electrode locations. The Clough-Tocher scheme is then applied to interpolate the scattered power measurements over the scalp and to estimate the values between the electrodes over a 32x32 mesh. After that, the β-VAE-LSTM algorithm is used to perform the quadrant (arousal-valence) classification and its accuracy is estimated. A comparison between the β-VAE-LSTM model and other classic methods, conducted under the same experimental settings, shows that the proposed model is effective. Finally, a novel real-time emotion detection system based on EEG signals from a portable headband is presented and integrated into the interactive film ‘RIOT’. First, the requirements of the interactive film were collected and a protocol for data collection using a portable EEG sensor (Emotiv Epoc) was designed. Then, a portable EEG emotion database (PEED) was built from 10 participants, with emotion labels obtained using both self-reporting and video annotation tools. After that, various feature extraction, feature selection, validation and classification methods were explored to build a practical system for real-time detection. Finally, the emotion detection system was trained, integrated into the interactive film for real-time operation, and fully evaluated. The experimental results demonstrate that the system achieves satisfactory emotion detection accuracy and real-time performance.
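The AEP-plus-Clough-Tocher step above can be sketched as follows. This is a hedged illustration, not the thesis code: electrode positions and per-channel power values are synthetic, while the interpolation uses SciPy's `CloughTocher2DInterpolator` over a 32x32 mesh.

```python
# Hypothetical sketch: project 3-D electrode positions to 2-D with an
# azimuthal equidistant projection, then Clough-Tocher interpolate scattered
# channel power onto a 32x32 "EEG image". All inputs here are synthetic.
import numpy as np
from scipy.interpolate import CloughTocher2DInterpolator

def azimuthal_equidistant(xyz):
    """Project unit-sphere electrode positions onto the 2-D plane."""
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    theta = np.arccos(np.clip(z, -1.0, 1.0))   # polar angle from the vertex
    phi = np.arctan2(y, x)                     # azimuth
    return np.column_stack([theta * np.cos(phi), theta * np.sin(phi)])

rng = np.random.default_rng(0)
xyz = rng.normal(size=(32, 3))
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)   # 32 electrodes on a sphere
pts = azimuthal_equidistant(xyz)
power = rng.random(32)                              # per-channel band power

interp = CloughTocher2DInterpolator(pts, power, fill_value=0.0)
gx, gy = np.meshgrid(np.linspace(pts[:, 0].min(), pts[:, 0].max(), 32),
                     np.linspace(pts[:, 1].min(), pts[:, 1].max(), 32))
image = interp(gx, gy)                              # 32x32 EEG "image"
print(image.shape)                                  # (32, 32)
```

Stacking one such image per frequency band and time window produces the video-like tensors that a recurrent model such as an LSTM can then consume.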
Description: This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London
Appears in Collections: Electronic and Computer Engineering
Dept of Electronic and Computer Engineering Theses

Files in This Item:
File                Description                     Size     Format
FulltextThesis.pdf  File embargoed until 15/5/2019  4.62 MB  Adobe PDF

Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.