Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/28828
Full metadata record
DC Field | Value | Language
dc.contributor.author | Almatrouk, B | -
dc.contributor.author | Meng, H | -
dc.contributor.author | Swash, MR | -
dc.date.accessioned | 2024-04-20T16:01:53Z | -
dc.date.available | 2024-04-20T16:01:53Z | -
dc.date.issued | 2024-04-15 | -
dc.identifier | ORCiD: Bodor Almatrouk https://orcid.org/0009-0002-9041-2115 | -
dc.identifier | ORCiD: Hongying Meng https://orcid.org/0000-0002-8836-1382 | -
dc.identifier | ORCiD: Mohammad Rafiq Swash https://orcid.org/0000-0003-4242-7478 | -
dc.identifier | 3335 | -
dc.identifier.citation | Almatrouk, B., Meng, H. and Swash, M.R. (2024) 'Holoscopic Elemental-Image-Based Disparity Estimation Using Multi-Scale, Multi-Window Semi-Global Block Matching', Applied Sciences, 14 (8), 3335, pp. 1-23. doi: 10.3390/app14083335. | en_US
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/28828 | -
dc.description | Data Availability Statement: The data presented in this study are available on request from the corresponding author, Bodor Almatrouk, at bodor.almatrouk@brunel.ac.uk. The data are not publicly available due to commercial privacy. | en_US
dc.description.abstract | Holoscopic imaging acquires full-colour spatial images through a single aperture, in the manner of a fly's eye, by introducing slight angular offsets between neighbouring lenses with a micro-lens array. Because its data collection and visualisation methods are simple and provide robust, scalable spatial information, and because it captures motion parallax, binocular disparity, and convergence, this technique may overcome traditional 2D imaging issues such as depth, scalability, and multi-perspective problems. A novel disparity-map-generation method uses the angular information in the micro-images, or Elemental Images (EIs), of a single Holoscopic image to create a scene's disparity map. Little prior research has used EIs instead of Viewpoint Images (VPIs) for disparity estimation; this study investigates whether angular perspective data can replace spatial orthographic data. The low-resolution, texture-poor EIs are pre-processed with noise reduction and contrast enhancement before the disparity is calculated. The Semi-Global Block Matching (SGBM) technique computes the disparity between EI pixels. A multi-resolution approach overcomes the EIs' resolution constraints, and a content-aware analysis dynamically adjusts the SGBM window size to generate disparities across different texture and complexity levels. A background mask, together with neighbouring EIs that have accurate backgrounds, detects and rectifies EIs with erroneous backgrounds. The proposed method generates disparity maps that outperform two state-of-the-art deep learning algorithms and VPI-based estimation on real images. | en_US
dc.description.sponsorship | This research received no external funding. | en_US
dc.format.extent | 1 - 23 | -
dc.language | English | -
dc.language.iso | en_US | en_US
dc.publisher | MDPI | en_US
dc.rights | Copyright © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | -
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | -
dc.subject | holoscopic | en_US
dc.subject | elemental images | en_US
dc.subject | viewpoint images | en_US
dc.subject | micro-lenses | en_US
dc.subject | disparity | en_US
dc.subject | SGBM | en_US
dc.title | Holoscopic Elemental-Image-Based Disparity Estimation Using Multi-Scale, Multi-Window Semi-Global Block Matching | en_US
dc.type | Article | en_US
dc.identifier.doi | https://doi.org/10.3390/app14083335 | -
dc.relation.isPartOf | Applied Sciences | -
pubs.issue | 8 | -
pubs.publication-status | Published online | -
pubs.volume | 14 | -
dc.identifier.eissn | 2076-3417 | -
dc.rights.license | https://creativecommons.org/licenses/by/4.0/legalcode.en | -
dc.rights.holder | The authors | -
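
For context, the following is a minimal, illustrative sketch (not the authors' implementation) of the kind of elemental-image disparity step the abstract describes: two neighbouring Elemental Images are denoised, contrast-enhanced, upsampled, and matched with OpenCV's Semi-Global Block Matching. The file names, upsampling factor, and SGBM parameters are assumptions for illustration only; the paper's multi-window, content-aware, and background-correction stages are not reproduced here.

# Illustrative sketch only: SGBM disparity between two adjacent elemental images.
# File names, scale factor and SGBM parameters are assumptions, not the paper's values.
import cv2
import numpy as np

def preprocess(ei: np.ndarray) -> np.ndarray:
    """Denoise and contrast-enhance a low-resolution elemental image."""
    ei = cv2.fastNlMeansDenoising(ei, None, h=10)                 # noise reduction
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(4, 4))
    return clahe.apply(ei)                                        # local contrast enhancement

def ei_disparity(left_ei: np.ndarray, right_ei: np.ndarray, block_size: int = 5) -> np.ndarray:
    """Compute an SGBM disparity map between two neighbouring elemental images."""
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=16,        # small search range: neighbouring EIs shift by only a few pixels
        blockSize=block_size,     # a content-aware scheme could vary this per region
        P1=8 * block_size ** 2,
        P2=32 * block_size ** 2,
        uniquenessRatio=10,
        mode=cv2.STEREO_SGBM_MODE_SGBM,
    )
    return sgbm.compute(left_ei, right_ei).astype(np.float32) / 16.0  # SGBM returns fixed-point x16

if __name__ == "__main__":
    # Hypothetical inputs: two adjacent EIs cropped from a holoscopic capture.
    left = cv2.imread("ei_left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("ei_right.png", cv2.IMREAD_GRAYSCALE)
    scale = 4  # assumed upsampling factor to mitigate the EIs' low resolution
    left = cv2.resize(preprocess(left), None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)
    right = cv2.resize(preprocess(right), None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)
    disparity = ei_disparity(left, right)
    out = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    cv2.imwrite("ei_disparity.png", out)

In the full method described in the abstract, the block size would be chosen per region by content-aware analysis and results would be combined across multiple resolutions; this sketch shows only the basic EI-pair SGBM step.
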
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | Copyright © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | 30.72 MB | Adobe PDF


This item is licensed under a Creative Commons License.