Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/31936
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Chen, Q
dc.contributor.author: Wu, K
dc.contributor.author: Zhong, Y
dc.contributor.author: Li, W
dc.contributor.author: Wang, M
dc.date.accessioned: 2025-09-08T07:15:23Z
dc.date.available: 2025-09-08T07:15:23Z
dc.date.issued: 2025-08-04
dc.identifier: ORCiD: Qi Chen https://orcid.org/0009-0005-3166-1893
dc.identifier: ORCiD: Mingfeng Wang https://orcid.org/0000-0001-6551-0325
dc.identifier.citation: Chen, Q. et al. (2025) 'A 2-stage vision-based localization methodology for efficient automatic charging of electric vehicles in uncertain environments', Robotica, 43 (8), pp. 2992 - 3010. doi: 10.1017/S0263574725102038 [en_US]
dc.identifier.issn: 0263-5747
dc.identifier.uri: https://bura.brunel.ac.uk/handle/2438/31936
dc.description: Data availability statement: The data that support the findings of this study are available from the corresponding author upon reasonable request. [en_US]
dc.description.abstract: Automatic visual localization of electric vehicle (EV) charging ports is challenging in uncertain environments with varying surface textures, reflections, lighting, and observation distances. Existing methods require extensive real-world training data and well-focused images to achieve robust, accurate localization; both requirements are difficult to meet under variable and unpredictable conditions. This paper proposes a 2-stage vision-based localization approach. First, an image synthesis technique is used to reduce the cost of real-world data collection, and a task-oriented parameterization protocol (TOPP) is proposed to optimize the quality of the synthetic images. Second, an autofocus and servoing strategy is proposed: a hybrid detector enhances sharpness assessment, while a visual servoing method based on single exponential smoothing (SES) improves stability and efficiency during the search process. Experiments were conducted to evaluate image synthesis efficiency, detection accuracy, and servoing performance. The proposed method achieved 99% detection accuracy on real-world port images and guided the robot to the optimal imaging position within 16 s, outperforming comparable approaches. These results highlight its potential for robust automated charging in real-world scenarios. [en_US]
dc.description.sponsorship: Funding: Research supported by the State Key Laboratory of Digital Manufacturing Equipment and Technology, Grant No. DMETKF2021018; the GJYC program of Guangzhou, Grant ID 2024D03J0005; and the Chunhui Project Foundation of the Education Department of China, Grant No. 202201789. [en_US]
dc.format.extent: 2992 - 3010
dc.format.medium: Print-Electronic
dc.language: English
dc.language.iso: en [en_US]
dc.publisher: Cambridge University Press [en_US]
dc.rights: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: automatic charging robot [en_US]
dc.subject: synthetic images [en_US]
dc.subject: Sim2real transfer learning [en_US]
dc.subject: vision-based servoing [en_US]
dc.title: A 2-stage vision-based localization methodology for efficient automatic charging of electric vehicles in uncertain environments [en_US]
dc.type: Article [en_US]
dc.date.dateAccepted: 2025-07-02
dc.identifier.doi: https://doi.org/10.1017/S0263574725102038
dc.relation.isPartOf: Robotica
pubs.issue: 8
pubs.publication-status: Published
pubs.volume: 43
dc.identifier.eissn: 1469-8668
dc.rights.license: https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode.en
dc.rights.holder: Cambridge University Press
dc.contributor.orcid: Chen, Qi [0009-0005-3166-1893]
dc.contributor.orcid: Wang, Mingfeng [0000-0001-6551-0325]
Appears in Collections: Department of Mechanical and Aerospace Engineering Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2025 Cambridge University Press. This article has been published in a revised form in Robotica, https://doi.org/10.1017/S0263574725102038. This version is free to view and download for private research and study only. Not for re-distribution, re-sale or use in derivative works. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/) (see: https://www.cambridge.org/core/services/open-access-policies/open-access-books/green-open-access-policy-for-books).
Size: 2.9 MB
Format: Adobe PDF


This item is licensed under a Creative Commons License.