Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/30818
Title: Lightweight Facial Attractiveness Prediction Using Dual Label Distribution
Authors: Liu, S
Huang, E
Xu, Y
Wang, K
Kui, X
Lei, T
Meng, H
Keywords: facial attractiveness prediction;dual label distribution;lightweight;label distribution learning
Issue Date: 13-Jan-2025
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Liu, S. et al. (2025) 'Lightweight Facial Attractiveness Prediction Using Dual Label Distribution', IEEE Transactions on Cognitive and Developmental Systems, 0 (early access), pp. 1-11. doi: 10.1109/TCDS.2025.3529177.
Abstract: Facial attractiveness prediction (FAP) aims to assess facial attractiveness automatically based on human aesthetic perception. Previous methods using deep convolutional neural networks have improved performance, but their large-scale models come at the cost of efficiency. In addition, most methods fail to take full advantage of the dataset. In this paper, we present a novel end-to-end FAP approach that integrates dual label distribution and a lightweight design. To make the best use of the dataset, the manual ratings, attractiveness score, and standard deviation are aggregated explicitly to construct a dual label distribution, comprising the attractiveness distribution and the rating distribution. These distributions, together with the attractiveness score, are optimized under a joint learning framework based on the label distribution learning (LDL) paradigm. For a lightweight design, the data processing is simplified to a minimum and MobileNetV2 is selected as the backbone. Extensive experiments are conducted on two benchmark datasets, where our approach achieves promising results and balances performance and efficiency. Ablation studies demonstrate that the designed learning modules are indispensable and correlated. Visualization further indicates that the approach perceives facial attractiveness and captures attractive facial regions to facilitate semantic predictions. The code is available at https://github.com/enquan/2D_FAP.
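
To make the construction described in the abstract concrete, below is a minimal sketch (not the authors' released code; see the GitHub repository above) of how a dual label distribution could be assembled from per-image ratings. It assumes 1-5 integer ratings from multiple annotators, a normalized histogram for the rating distribution, and a discretized Gaussian parameterized by the mean score and standard deviation for the attractiveness distribution; the bin counts and discretization scheme are illustrative assumptions, not taken from the paper.

    # Illustrative sketch only: assumes 1-5 integer ratings per image and a
    # Gaussian discretization for the attractiveness distribution.
    import numpy as np

    def rating_distribution(ratings, num_levels=5):
        # Normalized histogram over the discrete rating levels 1..num_levels.
        counts = np.bincount(np.asarray(ratings, dtype=int),
                             minlength=num_levels + 1)[1:]
        return counts / counts.sum()

    def attractiveness_distribution(mean_score, std_dev, num_bins=50, lo=1.0, hi=5.0):
        # Discretized Gaussian centered on the mean attractiveness score with
        # the raters' standard deviation, evaluated on a finer grid of bins.
        bins = np.linspace(lo, hi, num_bins)
        probs = np.exp(-0.5 * ((bins - mean_score) / max(std_dev, 1e-6)) ** 2)
        return probs / probs.sum()

    # Example: ratings from several annotators for one face image.
    ratings = [3, 4, 4, 5, 3, 4]
    mean_score, std_dev = float(np.mean(ratings)), float(np.std(ratings))
    dual_label = {
        "rating_dist": rating_distribution(ratings),                        # coarse, per rating level
        "attract_dist": attractiveness_distribution(mean_score, std_dev),   # fine-grained
        "score": mean_score,                                                # scalar regression target
    }

Under the LDL paradigm, a network's predicted distributions would typically be trained against such targets (for example, with a divergence-based loss) alongside a regression loss on the scalar score, in the spirit of the joint learning framework described in the abstract.
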
URI: https://bura.brunel.ac.uk/handle/2438/30818
DOI: https://doi.org/10.1109/TCDS.2025.3529177
ISSN: 2379-8920
Other Identifiers: ORCiD: Hongying Meng https://orcid.org/0000-0002-8836-1382
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2025 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works (https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/).
Size: 743.19 kB
Format: Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.