Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/30818
Full metadata record
DC Field | Value | Language
dc.contributor.author | Liu, S | -
dc.contributor.author | Huang, E | -
dc.contributor.author | Xu, Y | -
dc.contributor.author | Wang, K | -
dc.contributor.author | Kui, X | -
dc.contributor.author | Lei, T | -
dc.contributor.author | Meng, H | -
dc.date.accessioned | 2025-02-25T18:17:32Z | -
dc.date.available | 2025-02-25T18:17:32Z | -
dc.date.issued | 2025-01-13 | -
dc.identifier | ORCiD: Hongying Meng https://orcid.org/0000-0002-8836-1382 | -
dc.identifier.citation | Liu, S. et al. (2025) 'Lightweight Facial Attractiveness Prediction Using Dual Label Distribution.', IEEE Transactions on Cognitive and Developmental Systems, 0 (early access), pp. 1 - 11. doi: 10.1109/TCDS.2025.3529177. | en_US
dc.identifier.issn | 2379-8920 | -
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/30818 | -
dc.description.abstract | Facial attractiveness prediction (FAP) aims to assess facial attractiveness automatically based on human aesthetic perception. Previous methods based on deep convolutional neural networks have improved performance, but their large-scale models are inefficient. In addition, most methods fail to take full advantage of the dataset. In this paper, we present a novel end-to-end FAP approach that integrates dual label distribution and a lightweight design. To make the best use of the dataset, the manual ratings, attractiveness score, and standard deviation are aggregated explicitly to construct a dual label distribution, comprising the attractiveness distribution and the rating distribution. These distributions, together with the attractiveness score, are optimized under a joint learning framework based on the label distribution learning (LDL) paradigm. For a lightweight design, the data processing is simplified to a minimum, and MobileNetV2 is selected as our backbone. Extensive experiments on two benchmark datasets show that our approach achieves promising results and succeeds in balancing performance and efficiency. Ablation studies demonstrate that our carefully designed learning modules are indispensable and correlated. Additionally, visualization indicates that our approach can perceive facial attractiveness and capture attractive facial regions, facilitating semantically meaningful predictions. The code is available at https://github.com/enquan/2D_FAP. | en_US
dc.format.extent | 1 - 11 | -
dc.format.medium | Print-Electronic | -
dc.language.iso | en_US | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.rights | Copyright © 2025 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works (https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/). | -
dc.rights.uri | https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/ | -
dc.subject | facial attractiveness prediction | en_US
dc.subject | dual label distribution | en_US
dc.subject | lightweight | en_US
dc.subject | label distribution learning | en_US
dc.title | Lightweight Facial Attractiveness Prediction Using Dual Label Distribution. | en_US
dc.type | Article | en_US
dc.identifier.doi | https://doi.org/10.1109/TCDS.2025.3529177 | -
dc.relation.isPartOf | IEEE Transactions on Cognitive and Developmental Systems | -
pubs.issue | 00 | -
pubs.publication-status | Published | -
pubs.volume | 0 | -
dc.identifier.eissn | 2379-8939 | -
dc.rights.holder | Institute of Electrical and Electronics Engineers (IEEE) | -
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | Copyright © 2025 IEEE (full rights statement as given in dc.rights above) | 743.19 kB | Adobe PDF (View/Open)


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.