Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/28667
Full metadata record
DC Field | Value | Language
dc.contributor.author | Hu, C | -
dc.contributor.author | Gu, S | -
dc.contributor.author | Yang, M | -
dc.contributor.author | Han, G | -
dc.contributor.author | Lai, CS | -
dc.contributor.author | Gao, M | -
dc.contributor.author | Yang, Z | -
dc.contributor.author | Ma, G | -
dc.date.accessioned | 2024-03-31T15:50:23Z | -
dc.date.available | 2024-03-31T15:50:23Z | -
dc.date.issued | 2024-01-06 | -
dc.identifier | ORCiD: Chun Sing Lai https://orcid.org/0000-0002-4169-4438 | -
dc.identifier.citation | Hu, C. et al. (2024) 'MDEmoNet: A Multimodal Driver Emotion Recognition Network for Smart Cockpit', 2024 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 6-8 January, pp. 1 - 6. doi: 10.1109/ICCE59016.2024.10444365. | en_US
dc.identifier.isbn | 9798350324136 (ebk) | -
dc.identifier.isbn | 979-8-3503-2414-3 (PoD) | -
dc.identifier.issn | 0747-668X | -
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/28667 | -
dc.description.abstract | The automotive smart cockpit is an intelligent, connected in-vehicle consumer electronics product that can provide a safe, efficient, comfortable, and enjoyable human-machine interaction experience. Emotion recognition technology can help the smart cockpit better understand the driver's needs and state, improve the driving experience, and enhance safety. Currently, driver emotion recognition faces challenges such as low accuracy and high latency. In this paper, we propose a multimodal driver emotion recognition model. To the best of our knowledge, this is the first work to improve the accuracy of driver emotion recognition by using facial video and driving behavior (including brake pedal force and vehicle Y-axis and Z-axis position) as inputs and employing a multi-task training approach. For verification, the proposed scheme is compared with mainstream state-of-the-art methods on the publicly available multimodal driver emotion dataset PPB-Emo. | en_US
dc.description.sponsorship | 10.13039/501100001809-National Natural Science Foundation of China; 10.13039/100006190-Research and Development; 10.13039/501100003009-Science and Technology Development Fund. | en_US
dc.format.medium | Print-Electronic | -
dc.language.iso | en_US | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.rights | Copyright © 2024 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works by sending a request to pubs-permissions@ieee.org. See: https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishingethics/guidelines-and-policies/post-publication-policies/ for more information. | -
dc.rights.uri | https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishingethics/guidelines-and-policies/post-publication-policies/ | -
dc.subject | smart cockpit | en_US
dc.subject | driver emotion recognition | en_US
dc.subject | deep learning | en_US
dc.subject | multimodal fusion | en_US
dc.title | MDEmoNet: A Multimodal Driver Emotion Recognition Network for Smart Cockpit | en_US
dc.type | Conference Paper | en_US
dc.identifier.doi | https://doi.org/10.1109/ICCE59016.2024.10444365 | -
dc.relation.isPartOf | Digest of Technical Papers - IEEE International Conference on Consumer Electronics | -
pubs.publication-status | Published | -
dc.identifier.eissn | 2159-1423 | -
dc.identifier.eissn | 2158-4001 | -
dc.rights.holder | Institute of Electrical and Electronics Engineers (IEEE) | -
Appears in Collections:Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | - | 780.17 kB | Adobe PDF | View/Open


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.