Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/32923

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Bingol, EC | - |
| dc.contributor.author | Al-Raweshidy, H | - |
| dc.contributor.author | Banitsas, K | - |
| dc.date.accessioned | 2026-03-03T08:58:09Z | - |
| dc.date.available | 2026-03-03T08:58:09Z | - |
| dc.date.issued | 2026-03-02 | - |
| dc.identifier | ORCiD: Emre Can Bingol https://orcid.org/0009-0005-0448-6372 | - |
| dc.identifier | ORCiD: Hamed Al-Raweshidy https://orcid.org/0000-0002-3702-8192 | - |
| dc.identifier | ORCiD: Konstantinos Banitsas https://orcid.org/0000-0003-2658-3032 | - |
| dc.identifier.citation | Bingol, E.C., Al-Raweshidy, H. and Banitsas, K. (2026) 'Vision-Based Dual-Mode Collision Risk-Warning for Aircraft Apron Monitoring', Drones, 10, 173, pp. 1–41. doi: 10.3390/drones10030173. | en-US |
| dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/32923 | - |
| dc.description | Highlights: What are the main findings? • Under identical detector inputs (optimised YOLOv8-Seg) and without tracker-specific tuning, DeepSORT delivered the most stable identity tracking on the 997-frame Microsoft Flight Simulator (MSFS) simulation-based incident reenactment benchmark using the airplane-only MOTChallenge ground truth: Multi-Object Tracking Accuracy (MOTA) 92.77%, recall 93.27%, and one ID switch. • A dual-mode incident-warning framework was developed: (i) a reactive module based on segmentation-mask proximity and (ii) a proactive module based on short-horizon trajectory extrapolation and future-Intersection-over-Union (IoU) risk triggering. The modules can be used independently or jointly. What are the implications of the main findings? • The MSFS reenactment sequence and its associated labels provide a reproducible testbed that helps mitigate the scarcity of annotated apron-incident data for detection, tracking, and risk studies. • A scaled Unmanned Aerial Vehicle (UAV)/laboratory validation protocol is defined to assess end-to-end feasibility on UAV-captured imagery (reported qualitatively via representative frames and warning overlays). | en-US |
| dc.description | Data Availability Statement: The dataset and annotations generated in this study are available on Roboflow at: https://app.roboflow.com/ecb-wba09/mot-challange-dataset/browse?query-Text=&pageSize=50&startingIndex=0&browseQuery=true (accessed on 28 January 2026). | en-US |
| dc.description.abstract | Ground incidents on airport aprons can cause substantial operational disruption and economic loss, while conventional surveillance (e.g., Surface Movement Radar (SMR), Closed-Circuit Television (CCTV)) often lacks the resolution and proactive decision support required for close-proximity operations. This study proposes a UAV-deployable, camera-agnostic Computer Vision (CV) framework for collision-risk warning from elevated viewpoints. An optimised YOLOv8-Seg backbone performs multi-class aircraft segmentation (airplane, wing, nose, tail, and fuselage) and is integrated with four MOT algorithms under identical evaluation settings. For quantitative tracker benchmarking, DeepSORT provides the strongest overall performance on the airplane-only MOTChallenge-format ground truth (MOTA 92.77%, recall 93.27%). To mitigate the scarcity of annotated apron-incident data, a labelled 997-frame MOT dataset is created via an MSFS simulation-based reenactment inspired by the 2018 Asiana–Turkish Airlines wing-to-tail event at Istanbul Ataturk Airport. The framework further introduces a dual-module warning mechanism that can operate independently: (i) a reactive module using image-plane proximity derived from segmentation masks, and (ii) a proactive module that predicts short-horizon conflicts via trajectory extrapolation and IoU-based future overlap analysis. The approach is evaluated on multiple simulated incident scenarios and assessed on a real apron video from Hong Kong International Airport; additionally, laboratory-scale UAV experiments using diecast aircraft models provide end-to-end feasibility evidence on unmanned-platform imagery. Overall, the results indicate timely warnings and practical feasibility for low-overhead UAV-enabled apron monitoring. | en-US |
| dc.description.sponsorship | This research received no specific external project funding. The first author is supported by a PhD scholarship from the Ministry of National Education of Türkiye (no grant number). This scholarship did not directly fund the work reported in this paper. | en-US |
| dc.format.extent | 1–41 | - |
| dc.format.medium | Electronic | - |
| dc.language.iso | en_US | en-US |
| dc.publisher | MDPI | en-US |
| dc.rights | Creative Commons Attribution 4.0 International | - |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | - |
| dc.subject | unmanned aerial systems | en-US |
| dc.subject | collision-risk warning | en-US |
| dc.subject | apron safety | en-US |
| dc.subject | simulation-based reenactment | en-US |
| dc.subject | YOLOv8-Seg | en-US |
| dc.subject | DeepSORT | en-US |
| dc.subject | multi-object tracking | en-US |
| dc.subject | aircraft tracking | en-US |
| dc.subject | trajectory prediction | en-US |
| dc.subject | computer vision | en-US |
| dc.subject | UAV monitoring | en-US |
| dc.title | Vision-Based Dual-Mode Collision Risk-Warning for Aircraft Apron Monitoring | en-US |
| dc.type | Article | en-US |
| dc.date.dateAccepted | 2026-02-27 | - |
| dc.identifier.doi | https://doi.org/10.3390/drones10030173 | - |
| dc.relation.isPartOf | Drones | - |
| pubs.publication-status | Published | - |
| pubs.volume | 10 | - |
| dc.identifier.eissn | 2504-446X | - |
| dc.rights.license | https://creativecommons.org/licenses/by/4.0/legalcode.en | - |
| dcterms.dateAccepted | 2026-02-27 | - |
| dc.rights.holder | The Authors | - |
| dc.contributor.orcid | Bingol, Emre Can [0009-0005-0448-6372] | - |
| dc.contributor.orcid | Al-Raweshidy, Hamed [0000-0002-3702-8192] | - |
| dc.contributor.orcid | Banitsas, Konstantinos [0000-0003-2658-3032] | - |
| dc.identifier.number | 173 | - |
Appears in Collections: Department of Electronic and Electrical Engineering Research Papers
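The abstract above describes a proactive module that predicts short-horizon conflicts via trajectory extrapolation and IoU-based future overlap analysis. The record itself contains no implementation details, so the following is a minimal illustrative sketch only: it assumes constant-velocity bounding-box extrapolation and a hypothetical `iou_threshold` parameter, and is not the authors' code.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def extrapolate(box, velocity, steps):
    """Shift a box by a constant per-frame (vx, vy) velocity for `steps` frames."""
    vx, vy = velocity
    return (box[0] + vx * steps, box[1] + vy * steps,
            box[2] + vx * steps, box[3] + vy * steps)

def future_conflict(box_a, vel_a, box_b, vel_b, horizon=30, iou_threshold=0.05):
    """Return the first frame offset within `horizon` at which the extrapolated
    boxes overlap by more than `iou_threshold`, or None if no conflict is found."""
    for t in range(1, horizon + 1):
        if iou(extrapolate(box_a, vel_a, t),
               extrapolate(box_b, vel_b, t)) > iou_threshold:
            return t
    return None
```

In this sketch the per-frame velocity would come from the tracker's recent box history (e.g. a frame-to-frame centroid difference); a conflict returned at offset `t` would raise the proactive warning `t` frames ahead of the predicted overlap.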
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| FullText.pdf | Copyright © 2026 The Authors. This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | 17.91 MB | Adobe PDF |