Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/25452
Title: Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification
Authors: Lai, Y
Guan, W
Luo, L
Guo, Y
Song, H
Meng, H
Keywords: extended stochastic variational inference;mixture models;Bayesian estimation;text categorization;network traffic classification;misuse intrusion detection
Issue Date: 25-Oct-2022
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Lai, Y. et al. (2024) 'Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification', IEEE Transactions on Neural Networks and Learning Systems, 35 (5), pp. 6948 - 6962. doi: 10.1109/tnnls.2022.3213518
Abstract: The finite inverted beta mixture model (IBMM) has been proven to be efficient in modeling positive vectors. Under the traditional variational inference framework, the critical challenge in Bayesian estimation of the IBMM is that the computational cost of performing inference with large datasets is prohibitively expensive, which often limits the use of Bayesian approaches to small datasets. The recently proposed stochastic variational inference (SVI) framework provides an efficient alternative that enables inference on large datasets. Nevertheless, when the SVI framework is applied to non-Gaussian statistical models, the evidence lower bound (ELBO) cannot be explicitly calculated due to intractable moment computations. Therefore, an algorithm under the SVI framework cannot directly use stochastic optimization to optimize the ELBO, and an analytically tractable solution cannot be derived. To address this problem, we propose a more flexible extended version of the SVI framework, namely, the extended SVI (ESVI) framework, which can be applied to many non-Gaussian statistical models. First, some approximation strategies are applied to further lower the ELBO to avoid intractable moment calculations. Then, stochastic optimization with noisy natural gradients is used to optimize the lower bound. The excellent performance and effectiveness of the proposed method are verified on real-data evaluations.
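The stochastic optimization step the abstract refers to can be sketched on a toy conjugate model. This is not the paper's IBMM or ESVI derivation; it is a minimal Beta-Bernoulli example of the general SVI recipe (Hoffman et al.-style), where the noisy natural-gradient step reduces to interpolating the global variational parameters toward a minibatch-scaled estimate of the full-data update. All parameter names and step-size choices here are illustrative assumptions:

```python
import random

# Toy sketch (NOT the paper's IBMM): stochastic variational inference with
# noisy natural-gradient updates for a conjugate Beta-Bernoulli model.
# q(theta) = Beta(a, b); in conjugate-exponential models the natural-gradient
# step is a convex combination of the current parameters and an intermediate
# estimate built from a minibatch with rescaled sufficient statistics.

random.seed(0)
N = 2000
data = [1 if random.random() < 0.3 else 0 for _ in range(N)]  # synthetic data

a0, b0 = 1.0, 1.0        # Beta prior hyperparameters (illustrative)
a, b = a0, b0            # global variational parameters
S = 50                   # minibatch size
tau, kappa = 1.0, 0.7    # Robbins-Monro step-size schedule (assumed values)

for t in range(1, 501):
    batch = random.sample(data, S)
    ones = sum(batch)
    # Intermediate estimate: the full-batch coordinate update, pretending the
    # whole dataset of size N looked like this minibatch (stats scaled by N/S).
    a_hat = a0 + (N / S) * ones
    b_hat = b0 + (N / S) * (S - ones)
    rho = (t + tau) ** (-kappa)        # decreasing step size
    a = (1 - rho) * a + rho * a_hat    # noisy natural-gradient step
    b = (1 - rho) * b + rho * b_hat

print(a / (a + b))  # posterior mean estimate of theta, near the true rate 0.3
```

The point of the sketch is the update structure: each iteration touches only a minibatch, yet the iterates converge toward the full-data variational solution, which is what makes SVI-style inference feasible on large datasets. The paper's contribution (ESVI) addresses the case where, unlike here, the ELBO's moments are intractable and must first be lower-bounded before such stochastic updates can be applied.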
URI: https://bura.brunel.ac.uk/handle/2438/25452
DOI: https://doi.org/10.1109/tnnls.2022.3213518
ISSN: 2162-237X
Other Identifiers: ORCiD: Yuping Lai https://orcid.org/0000-0002-3797-1228
ORCiD: Wenbo Guan https://orcid.org/0000-0002-4645-6121
ORCiD: Lijuan Luo https://orcid.org/0000-0002-3702-372X
ORCiD: Heping Song https://orcid.org/0000-0002-8583-2804
ORCiD: Hongying Meng https://orcid.org/0000-0002-8836-1382
Appears in Collections:Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File: FullText.pdf (23.14 MB, Adobe PDF) - View/Open
Description: Copyright © 2022 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. See: https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.