Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/11324
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Aung, M | - |
dc.contributor.author | Kaltwang, S | - |
dc.contributor.author | Romera-Paredes, B | - |
dc.contributor.author | Martinez, B | - |
dc.contributor.author | Singh, A | - |
dc.contributor.author | Cella, M | - |
dc.contributor.author | Valstar, M | - |
dc.contributor.author | Meng, H | - |
dc.contributor.author | Kemp, A | - |
dc.contributor.author | Shafizadeh, M | - |
dc.contributor.author | Elkins, A | - |
dc.contributor.author | Kanakam, N | - |
dc.contributor.author | Rothschild, A | - |
dc.contributor.author | Tyler, N | - |
dc.contributor.author | Watson, P | - |
dc.contributor.author | Williams, A | - |
dc.contributor.author | Pantic, M | - |
dc.contributor.author | Bianchi-Berthouze, N | - |
dc.date.accessioned | 2015-09-07T12:54:47Z | - |
dc.date.available | 2015-06 | - |
dc.date.available | 2015-09-07T12:54:47Z | - |
dc.date.issued | 2015 | - |
dc.identifier.citation | IEEE Transactions on Affective Computing, 2015 | en_US |
dc.identifier.issn | 1949-3045 | - |
dc.identifier.uri | http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=7173007&sortType%3Dasc_p_Sequence%26filter%3DAND(p_Publication_Number%3A5165369)%26rowsPerPage%3D50 | - |
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/11324 | - |
dc.description.abstract | Pain-related emotions are a major barrier to effective self-rehabilitation in chronic pain. Automated coaching systems capable of detecting these emotions are a potential solution. This paper lays the foundation for the development of such systems by making three contributions. First, through literature reviews, an overview of how chronic pain is expressed and the motivation for detecting it in physical rehabilitation is provided. Second, a fully labelled multimodal dataset is supplied, containing high-resolution multiple-view face videos, head-mounted and room audio signals, full-body 3-D motion capture, and electromyographic signals from back muscles. Natural, unconstrained pain-related facial expressions and body movement behaviours were elicited from people with chronic pain carrying out physical exercises. Both instructed and non-instructed exercises were considered to reflect different rehabilitation scenarios. Two sets of labels were assigned: the level of pain from facial expressions, annotated by eight raters, and the occurrence of six pain-related body behaviours, segmented by four experts. Third, through exploratory experiments grounded in the data, the factors and challenges in the automated recognition of such expressions and behaviours are described. The paper concludes by discussing potential avenues in the context of these findings, also highlighting differences between the two exercise scenarios addressed. | en_US |
dc.description.sponsorship | This work was funded by the EPSRC Emotion & Pain Project (EP/H017178/1, EP/H017194/1, EP/H016988/1). | en_US |
dc.language.iso | en | en_US |
dc.publisher | IEEE | en_US |
dc.subject | Chronic low back pain | en_US |
dc.subject | Emotion | en_US |
dc.subject | Pain behaviour | en_US |
dc.subject | Body movement | en_US |
dc.subject | Facial expression | en_US |
dc.subject | Electromyography | en_US |
dc.subject | Motion capture | en_US |
dc.subject | Automatic emotion recognition | en_US |
dc.subject | Multimodal database | en_US |
dc.title | The automatic detection of chronic pain-related expression: requirements, challenges and a multimodal dataset | en_US |
dc.type | Article | en_US |
dc.identifier.doi | http://dx.doi.org/10.1109/TAFFC.2015.2462830 | - |
dc.relation.isPartOf | IEEE Transactions on Affective Computing | - |
pubs.publication-status | Accepted | - |
Appears in Collections: | Dept of Electronic and Electrical Engineering Research Papers |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Fulltext.pdf | | 949.04 kB | Adobe PDF | View/Open |
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.