Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/12072
Title: Automatic emotional state detection and analysis on embedded devices
Authors: Turabzadeh, Saeed
Advisors: Meng, H
Keywords: Emotional state detection;Signal, image and video processing;Embedded devices;FPGA;Machine learning and computer vision algorithms
Issue Date: 2015
Publisher: Brunel University London
Abstract: Over the last decade, studies on human facial emotion recognition have shown that computational models based on regression modelling can achieve usable performance. In this study, an automatic real-time facial expression recognition system was built and tested. Two widely used techniques were adopted: the Local Binary Pattern (LBP) method, which has been applied in many machine vision research projects, and the K-Nearest Neighbour (K-NN) algorithm, which was used for regression modelling. In this study, these two techniques were combined and implemented on an FPGA for the first time, in such a way as to display a continuous, automatic emotional state detection model on a monitor. To evaluate the effectiveness of the classification technique for recognizing human emotion from video, the model was first designed and tested in MATLAB and then in MATLAB Simulink, where it recognized continuous facial expressions in real time at a rate of 1 frame per second on a desktop PC. It was evaluated on a testing dataset and the experimental results were promising, with an accuracy of 51.28%. The datasets and labels used in this study were created from videos recorded twice from 5 participants while they watched a video. To achieve a higher frame rate in real time, the facial expression recognition system was then built on an FPGA, using the Atlys™ Spartan-6 FPGA Development Board. It performs continuous emotional state recognition in real time at a frame rate of 30 frames per second with an accuracy of 47.44%. A graphical user interface was designed to display the participant's video in real time together with the two-dimensional predicted emotion labels. This is the first time that automatic emotional state detection has been successfully implemented on an FPGA using LBP and K-NN techniques in such a way as to display a continuous, automatic emotional state detection model on a monitor.
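Note: As a rough illustration of the pipeline the abstract describes (LBP features fed to a K-NN regressor producing two-dimensional continuous emotion labels), the sketch below uses Python with scikit-image and scikit-learn. The library choices, parameter values (8 neighbours, radius 1, k = 5), and placeholder variable names are assumptions for illustration only, not the implementation used in the thesis.

    # Minimal sketch, assuming scikit-image and scikit-learn; not the thesis's
    # MATLAB/Simulink or FPGA implementation.
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.neighbors import KNeighborsRegressor

    def lbp_histogram(gray_face, points=8, radius=1):
        """Compute a uniform-LBP histogram as the feature vector for one grayscale face image."""
        lbp = local_binary_pattern(gray_face, points, radius, method="uniform")
        n_bins = points + 2  # uniform patterns plus one bin for non-uniform codes
        hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
        return hist

    # Hypothetical training data: train_faces is a list of grayscale face images,
    # train_labels is an (N, 2) array of continuous emotion labels.
    # X = np.vstack([lbp_histogram(face) for face in train_faces])
    # model = KNeighborsRegressor(n_neighbors=5).fit(X, train_labels)
    # prediction = model.predict(lbp_histogram(new_face).reshape(1, -1))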
Description: This thesis was submitted for the award of Master of Philosophy and was awarded by Brunel University London
URI: http://bura.brunel.ac.uk/handle/2438/12072
Appears in Collections: Electronic and Computer Engineering
Dept of Electronic and Electrical Engineering Theses

Files in This Item:
File: FulltextThesis.pdf (3.74 MB, Adobe PDF)


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.