Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/27448
Title: Rapid Localization and Mapping Method Based on Adaptive Particle Filters
Authors: Charroud, A
El Moutaouakil, K
Yahyaouy, A
Onyekpe, U
Palade, V
Huda, MN
Keywords: autonomous driving;feature extraction;mapping;localization;self-driving vehicles;SLAM
Issue Date: 2-Dec-2022
Publisher: MDPI
Citation: Charroud, A. et al. (2022) 'Rapid Localization and Mapping Method Based on Adaptive Particle Filters', Sensors, 22 (23), 9439, pp. 1 - 18. doi: 10.3390/s22239439.
Abstract: Copyright © 2022 by the authors. With the development of autonomous vehicles, localization and mapping technologies have become crucial to equip the vehicle with the appropriate knowledge for its operation. In this paper, we extend our previous work by presenting a localization and mapping architecture for autonomous vehicles that does not rely on GPS, particularly in environments such as tunnels, under bridges, urban canyons, and dense tree canopies. The proposed approach consists of two parts. Firstly, a K-means algorithm is employed to extract features from LiDAR scenes to create a local map of each scan. Then, we concatenate the local maps to create a global map of the environment and facilitate data association between frames. Secondly, the main localization task is performed by an adaptive particle filter that works in four steps: (a) generation of particles around an initial state (provided by the GPS); (b) updating the particle positions by providing the motion (translation and rotation) of the vehicle using an inertial measurement device; (c) selection of the best candidate particles by observing, at each timestamp, the match rate (also called the particle weight) between the local map (with the real-time distances to the objects) and the distances of the particles to the corresponding chunks of the global map; (d) averaging the selected particles to derive the estimated position, and, finally, using a resampling method on the particles to ensure the reliability of the position estimation. The performance of the newly proposed technique is investigated on different sequences of the Kitti and Pandaset raw data with different environmental setups, weather conditions, and seasonal changes. The obtained results validate the performance of the proposed approach in terms of speed and representativeness of the feature extraction for real-time localization in comparison with other state-of-the-art methods.
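The abstract above describes an algorithmic pipeline: K-means clustering of each LiDAR scan to build a local feature map, followed by a four-step adaptive particle filter (initialization around a GPS fix, IMU motion update, map-matching weight update, and weighted-mean estimation with resampling). The sketch below is a minimal, hypothetical Python illustration of those steps, assuming NumPy and scikit-learn are available; it is not the authors' implementation, and all function names, parameters, and the matching heuristic are assumptions made for demonstration only.

```python
# Illustrative sketch only -- NOT the authors' implementation.
# All names, parameters, and the matching heuristic are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def extract_local_map(scan_xyz, n_features=50):
    """Cluster one LiDAR scan (N x 3 points) with K-means and use the
    cluster centroids as a compact local map (feature set)."""
    kmeans = KMeans(n_clusters=n_features, n_init=10, random_state=0)
    kmeans.fit(scan_xyz)
    return kmeans.cluster_centers_          # shape (n_features, 3)

def particle_filter_step(particles, weights, motion, local_map, global_map,
                         motion_noise=0.1):
    """One update of a basic 2D particle filter over poses (x, y, theta):
    (b) propagate particles with the IMU motion (dx, dy, dtheta);
    (c) weight them by how well the local map matches the global map
        when placed at each particle pose;
    (d) estimate the pose as the weighted mean and resample."""
    dx, dy, dtheta = motion
    n = len(particles)

    # (b) motion update with additive Gaussian noise
    particles = particles + np.array([dx, dy, dtheta])
    particles = particles + np.random.normal(0.0, motion_noise, particles.shape)

    # (c) measurement update: transform the local map into the world frame
    # at each particle pose and score it against the nearest global features
    for i, (px, py, ptheta) in enumerate(particles):
        c, s = np.cos(ptheta), np.sin(ptheta)
        R = np.array([[c, -s], [s, c]])
        transformed = local_map[:, :2] @ R.T + np.array([px, py])
        # distance of each transformed feature to its closest global feature
        d = np.linalg.norm(
            transformed[:, None, :] - global_map[None, :, :2], axis=2).min(axis=1)
        weights[i] = np.exp(-d.mean())      # simple match-rate proxy
    weights = weights / weights.sum()

    # (d) weighted-mean estimate followed by resampling
    estimate = (weights[:, None] * particles).sum(axis=0)
    idx = np.random.choice(n, size=n, p=weights)
    return estimate, particles[idx], np.full(n, 1.0 / n)
```

Step (a) of the abstract would correspond to initializing the particle set around the GPS pose, e.g. `particles = np.random.normal(gps_pose, init_std, size=(n, 3))`, before iterating `particle_filter_step` over consecutive scans; these names are likewise illustrative.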
Description: Data Availability Statement: In this article, the Kitti dataset was used [29]: Carlevaris-Bianco, N.; Ushani, A.K.; Eustice, R.M. University of Michigan North Campus long-term vision and LiDAR dataset. Int. J. Robot. Res. 2016, 35, 1023–1035, which is available for free download online at https://doi.org/10.1177/0278364915614638.
This paper is an extended version of our paper published in Charroud, A.; Yahyaouy, A.; El Moutaouakil, K.; Onyekpe, U. Localisation and Mapping of Self-Driving Vehicles Based on Fuzzy K-Means Clustering: A Non-Semantic Approach. In Proceedings of the 2022 International Conference on Intelligent Systems and Computer Vision (ISCV), Fez, Morocco, 18–20 May 2022. https://doi.org/10.1109/iscv54655.2022.9806102.
URI: https://bura.brunel.ac.uk/handle/2438/27448
DOI: https://doi.org/10.3390/s22239439
Other Identifiers: ORCID iD: Anas Charroud https://orcid.org/0000-0002-6425-3096
ORCID iD: Karim El Moutaouakil https://orcid.org/0000-0003-3922-5592
ORCID iD: Vasile Palade https://orcid.org/0000-0002-6768-8394
ORCID iD: Md Nazmul Huda https://orcid.org/0000-0002-5376-881X
Article number: 9439
Appears in Collections:Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Size: 4.94 MB
Format: Adobe PDF


This item is licensed under a Creative Commons License.