Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/26188
Title: A Latent Encoder Coupled Generative Adversarial Network (LE-GAN) for Efficient Hyperspectral Image Super-Resolution
Authors: Shi, Y
Han, L
Han, L
Chang, S
Hu, T
Dancey, D
Keywords: hyperspectral image super-resolution; generative adversarial network; deep learning
Issue Date: 25-Jul-2022
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Shi, Y. et al. (2022) 'A Latent Encoder Coupled Generative Adversarial Network (LE-GAN) for Efficient Hyperspectral Image Super-Resolution', IEEE Transactions on Geoscience and Remote Sensing, 60, 5534819, pp. 1 - 19. doi: 10.1109/TGRS.2022.3193441.
Abstract: Realistic hyperspectral image (HSI) super-resolution (SR) techniques aim to generate a high-resolution (HR) HSI with higher spectral and spatial fidelity from its low-resolution (LR) counterpart. The generative adversarial network (GAN) has proven to be an effective deep learning framework for image SR. However, the optimization process of existing GAN-based models frequently suffers from mode collapse, which limits their capacity for spectral-spatial invariant reconstruction. This may cause spectral-spatial distortion in the generated HSI, especially at large upscaling factors. To alleviate mode collapse, this work proposes a novel GAN model coupled with a latent encoder (LE-GAN), which maps the generated spectral-spatial features from the image space to the latent space and produces a coupling component to regularize the generated samples. Essentially, we treat an HSI as a high-dimensional manifold embedded in a latent space. The optimization of the GAN is thus converted into the problem of learning the distributions of HR HSI samples in the latent space, bringing the distributions of the generated SR HSIs closer to those of their original HR counterparts. We have conducted experimental evaluations of the model's SR performance and its capability to alleviate mode collapse. The proposed approach has been tested and validated on two real HSI datasets from different sensors (i.e., AVIRIS and UHD-185) for various upscaling factors (i.e., ×2, ×4, and ×8) and added noise levels (i.e., ∞, 40, and 80 dB), and compared with state-of-the-art SR models (i.e., hyperspectral coupled network (HyCoNet), low tensor-train rank (LTTR), band attention GAN (BAGAN), SR-GAN, and WGAN). Experimental results show that the proposed model outperforms the competitors in SR quality, robustness, and alleviation of mode collapse.
The proposed approach is able to capture spectral and spatial details and generate more faithful samples than its competitors. It has also been found that the proposed model is more robust to noise and less sensitive to the upscaling factor and has been proven to be effective in improving the convergence of the generator and the spectral-spatial fidelity of the SR HSIs.
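To illustrate the latent-space coupling idea described in the abstract, the sketch below computes a regularization term as the distance between the latent codes of a real HR patch and a generated SR patch. This is a minimal, hypothetical toy in NumPy: the encoder, patch shapes, and loss weighting (`encode`, `lambda_latent`) are illustrative assumptions, not the architecture or loss defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Toy "latent encoder": flatten the HSI patch and project it into
    # the latent space (the real LE-GAN encoder is a trained network).
    return np.tanh(x.reshape(-1) @ W)

# Hypothetical shapes: an 8x8 patch with 16 spectral bands, 32-dim latent space
bands, h, w, latent_dim = 16, 8, 8, 32
W = rng.standard_normal((bands * h * w, latent_dim)) / np.sqrt(bands * h * w)

hr_real = rng.standard_normal((bands, h, w))                   # real HR HSI patch
sr_fake = hr_real + 0.1 * rng.standard_normal(hr_real.shape)   # stand-in for a generated SR patch

# Latent-space coupling term: squared distance between the encoded
# generated sample and the encoded real HR sample.
z_real = encode(hr_real, W)
z_fake = encode(sr_fake, W)
latent_loss = float(np.mean((z_fake - z_real) ** 2))

# In training, this penalty would be added to the adversarial loss, e.g.
#   L_G = L_adv + lambda_latent * latent_loss
# with lambda_latent a hyperparameter balancing the two terms.
print(latent_loss)
```

The penalty is zero only when the generated sample and its HR counterpart encode to the same latent point, which is the mechanism the abstract describes for pulling the generated distribution toward the real HR distribution.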
URI: https://bura.brunel.ac.uk/handle/2438/26188
DOI: https://doi.org/10.1109/TGRS.2022.3193441
ISSN: 0196-2892
Other Identifiers: ORCID iDs: Yue Shi https://orcid.org/0000-0001-8424-6996; Liangxiu Han https://orcid.org/0000-0003-2491-7473; Lianghao Han https://orcid.org/0000-0001-8672-1017; Sheng Chang https://orcid.org/0000-0001-7870-7047; Darren Dancey https://orcid.org/0000-0001-7251-8958.
Article number: 5534819
Appears in Collections:Dept of Computer Science Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2022 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works, by sending a request to pubs-permissions@ieee.org. See: https://www.ieee.org/publications/rights/rights-policies.html
Size: 4.68 MB
Format: Adobe PDF (View/Open)


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.