Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/32076
| Title: | Fast Skill Transfer Method for Peg-in-Hole Assembly Tasks Under Varied Visual Conditions |
| Authors: | Wu, K; Chen, Q; Zhao, H; Wang, M |
| Keywords: | assembly; reinforcement learning; assembly skill learning; skill transfer learning |
| Issue Date: | 6-Oct-2025 |
| Publisher: | Institute of Electrical and Electronics Engineers (IEEE) |
| Citation: | Wu, K. et al. (2025) 'Fast Skill Transfer Method for Peg-in-Hole Assembly Tasks Under Varied Visual Conditions', IEEE Robotics and Automation Letters, 10 (11), pp. 11792-11799. doi: 10.1109/LRA.2025.3617389. |
| Abstract: | Deep Reinforcement Learning (DRL) has emerged as a transformative approach in robotic assembly, offering unparalleled adaptability and efficiency in automating complex tasks. However, existing DRL methods generalize poorly and must retrain the policy for each new assembly scenario, which requires a significant amount of interaction and may harm the robot or the parts. This paper presents a fast skill transfer approach for submillimeter-level assembly tasks. The approach enables rapid adaptation to the texture and lighting variations commonly encountered in flexible manufacturing environments, with model parameters that can be quickly adjusted for seamless adaptation. Specifically, a concise distance-based encoder model is proposed to extract a latent representation from the low-dimensional seam-based image (SBI) and map the extracted feature to the distance space. A fine-tuning strategy then aligns the features of new scenes with those of the source scenes. The transfer strategy requires retraining only the feature extraction model, obviating the need to retrain the underlying RL policy. Simulation and real-world experiments are conducted to evaluate the proposed method, and the transfer can be completed in a few minutes. The policy trained in simulation can be transferred to different real-world assembly scenes with an average success rate of 94.3%, highlighting the method's potential for practical applications. (An illustrative sketch of the encoder fine-tuning step follows this record.) |
| URI: | https://bura.brunel.ac.uk/handle/2438/32076 |
| DOI: | https://doi.org/10.1109/LRA.2025.3617389 |
| Other Identifiers: | ORCiD: Kai Wu https://orcid.org/0000-0002-9475-0659; ORCiD: Qi Chen https://orcid.org/0009-0005-3166-1893; ORCiD: Huan Zhao https://orcid.org/0000-0002-1589-5375; ORCiD: Mingfeng Wang https://orcid.org/0000-0001-6551-0325 |
| Appears in Collections: | Dept of Mechanical and Aerospace Engineering; Embargoed Research Papers |
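
The abstract's transfer strategy, retraining only the feature encoder while keeping the RL policy frozen, can be pictured with a minimal sketch. This is not the paper's implementation: the class names, network sizes, input dimensions, data pairing, and the MSE alignment loss below are all assumptions made for illustration of the general idea.

```python
import torch
import torch.nn as nn

class SBIEncoder(nn.Module):
    """Hypothetical distance-based encoder: maps a low-dimensional seam-based
    image (SBI) to a latent feature treated as a distance-space representation.
    Architecture and sizes are illustrative only."""
    def __init__(self, sbi_dim: int = 64, latent_dim: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(sbi_dim, 128),
            nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, sbi: torch.Tensor) -> torch.Tensor:
        return self.net(sbi)

def finetune_encoder(encoder_new: nn.Module,
                     encoder_src: nn.Module,
                     new_sbi: torch.Tensor,
                     src_sbi: torch.Tensor,
                     steps: int = 200,
                     lr: float = 1e-3) -> None:
    """Align new-scene SBI features with paired source-scene features so a
    frozen RL policy can be reused without retraining (assumed objective)."""
    with torch.no_grad():                       # source encoder stays fixed
        target = encoder_src(src_sbi)           # reference distance features
    opt = torch.optim.Adam(encoder_new.parameters(), lr=lr)
    for _ in range(steps):
        pred = encoder_new(new_sbi)             # features from the new scene
        loss = nn.functional.mse_loss(pred, target)  # assumed alignment loss
        opt.zero_grad()
        loss.backward()
        opt.step()

# Usage sketch: the RL policy consumes encoder features and is never updated.
if __name__ == "__main__":
    src_encoder = SBIEncoder()
    new_encoder = SBIEncoder()
    new_encoder.load_state_dict(src_encoder.state_dict())  # warm start
    # Paired SBIs captured under source and new visual conditions
    # (random tensors here stand in for real data).
    src_batch, new_batch = torch.randn(32, 64), torch.randn(32, 64)
    finetune_encoder(new_encoder, src_encoder, new_batch, src_batch)
```

In this sketch the alignment objective is a plain MSE between paired features; the actual SBI construction, distance-space mapping, and fine-tuning loss used in the paper may differ.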
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| FullText.pdf | “For the purpose of open access, the author(s) has applied a Creative Commons Attribution (CC BY) license to any Accepted Manuscript version arising.” | 5.47 MB | Adobe PDF |
This item is licensed under a Creative Commons License