Title: Deep Bayesian Supervised Learning given Hypercuboidally-shaped, Discontinuous Data, using Compound Tensor-Variate & Scalar-Variate Gaussian Processes
Keywords: compound tensor-variate scalar-variate GPs; covariance kernel parametrisation; Lipschitz continuity; deep learning
Abstract: We undertake Bayesian learning of the high-dimensional functional relationship between a system parameter vector and an observable that is, in general, tensor-valued. The ultimate aim is Bayesian inverse prediction of the system parameters at which test data are recorded. We attempt such learning given hypercuboidally-shaped data that displays strong discontinuities, which render learning challenging. We model the sought high-dimensional function with a tensor-variate Gaussian Process (GP) and use three independent ways of learning the covariance matrices of the resulting likelihood, which is Tensor-Normal. We demonstrate that the discontinuous data demand non-stationary covariance kernels, achieved by modelling each kernel hyperparameter as a function of the sample function of the invoked tensor-variate GP. Each such function can be shown to be temporally evolving, and is treated as a realisation from a distinct scalar-variate GP, with covariance described adaptively by collating information from a historical set of samples of chosen sample size. We prove that deep learning with two "layers" suffices, where the outer layer comprises the tensor-variate GP, compounded with multiple scalar-variate GPs in the inner layer, and we undertake inference with Metropolis-within-Gibbs. We apply our method to a cuboidally-shaped, discontinuous, real dataset, and subsequently perform forward prediction to generate data from our model given our results, in order to perform model-checking.
Appears in Collections: Dept of Mathematics Research Papers
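To illustrate the kind of non-stationarity the abstract describes, the following is a minimal sketch of a covariance kernel whose length-scale hyperparameter is itself a function rather than a constant. Note the simplification: here the length-scale varies with the input via the classical Gibbs (1997) non-stationary kernel, whereas the paper's construction makes each hyperparameter a function of the GP's own sample function, learnt via inner-layer scalar-variate GPs; the `ell` function below is a hypothetical stand-in for such an inner-layer draw.

```python
import numpy as np

def gibbs_kernel(x1, x2, ell):
    """Gibbs non-stationary squared-exponential kernel with an
    input-dependent length-scale function ell(x); this form is
    guaranteed positive semi-definite for any positive ell."""
    l1 = ell(x1)[:, None]   # length-scales at the first set of inputs
    l2 = ell(x2)[None, :]   # length-scales at the second set of inputs
    prefactor = np.sqrt(2.0 * l1 * l2 / (l1**2 + l2**2))
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return prefactor * np.exp(-sq_dist / (l1**2 + l2**2))

# Hypothetical smoothly-varying length-scale, standing in for a
# realisation from an inner-layer scalar-variate GP (an assumption
# for illustration, not the paper's learnt hyperparameter function).
ell = lambda x: 0.3 + 0.2 * np.sin(x)

x = np.linspace(0.0, 5.0, 50)
K = gibbs_kernel(x, x, ell)   # a valid 50x50 covariance matrix
```

Because the length-scale shrinks and grows across the input space, sample paths drawn from a GP with this covariance can change character abruptly in some regions and vary slowly in others, which is the qualitative behaviour needed to fit strongly discontinuous data.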
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.