Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/7727
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Yu, K | -
dc.contributor.author | Al-Kenani, Ali J Kadhim | -
dc.date.accessioned | 2013-11-28T11:24:45Z | -
dc.date.available | 2013-11-28T11:24:45Z | -
dc.date.issued | 2013 | -
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/7727 | -
dc.description | This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University | en_US
dc.description.abstract | The aim of the work in this thesis is to carry out dimension reduction (DR) for high-dimensional (HD) data using statistical methods for variable selection, feature extraction and a combination of the two. In Chapter 2, DR is carried out through robust feature extraction, and robust canonical correlation analysis (RCCA) methods are proposed. In the correlation matrix of canonical correlation analysis (CCA), we suggest replacing the Pearson correlation with robust correlation measures in order to obtain robust correlation matrices; these matrices are then used to produce RCCA. Moreover, the classical covariance matrix is replaced by robust estimators of multivariate location and dispersion as an alternative route to RCCA. In Chapters 3 and 4, DR is carried out by combining variable selection via regularisation with feature extraction, through the minimum average variance estimation (MAVE) and single-index quantile regression (SIQ) methods, respectively. In particular, Chapter 3 extends the sparse MAVE (SMAVE) of Wang and Yin (2008) by combining the MAVE loss function with different regularisation penalties, and Chapter 4 proposes an extension of the SIQ of Wu et al. (2010) that considers different regularisation penalties. In Chapter 5, DR is carried out through variable selection within a Bayesian framework, and a flexible Bayesian framework for regularisation in the quantile regression (QR) model is proposed. This work differs from Bayesian Lasso quantile regression (BLQR), which employs the asymmetric Laplace distribution (ALD) for the errors; here the error distribution is assumed to be an infinite mixture of Gaussian (IMG) densities. | en_US
dc.language.iso | en | en_US
dc.publisher | Brunel University, School of Information Systems, Computing and Mathematics | -
dc.relation.ispartof | School of Information Systems, Computing and Mathematics | -
dc.subject | Dimension reduction | en_US
dc.subject | Variable selection | en_US
dc.subject | Lasso | en_US
dc.subject | Adaptive lasso | en_US
dc.subject | Quantile regression | en_US
dc.title | Some statistical methods for dimension reduction | en_US
dc.type | Thesis | en_US
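
The Chapter 2 summary in the abstract describes replacing the Pearson correlation matrix inside classical CCA with a robust correlation estimate. The following is a minimal sketch of that idea only, assuming Spearman's rho as the robust measure and a function name (robust_cca) chosen for illustration; the thesis itself may use different robust correlation or covariance estimators.

```python
# Sketch: canonical correlations computed from a robust (Spearman)
# correlation matrix instead of the Pearson one, as an illustration of
# the RCCA idea summarised in the abstract.
import numpy as np
from scipy.stats import spearmanr
from scipy.linalg import fractional_matrix_power, svd

def robust_cca(X, Y):
    """Return canonical correlations of X (n x p) and Y (n x q),
    using Spearman's rho in place of Pearson correlation."""
    p = X.shape[1]
    # Joint robust correlation matrix of (X, Y); Spearman is one possible choice.
    R, _ = spearmanr(np.hstack([X, Y]))
    Rxx, Rxy, Ryy = R[:p, :p], R[:p, p:], R[p:, p:]
    # As in classical CCA, the canonical correlations are the singular
    # values of Rxx^{-1/2} Rxy Ryy^{-1/2}, here built from robust blocks.
    M = fractional_matrix_power(Rxx, -0.5) @ Rxy @ fractional_matrix_power(Ryy, -0.5)
    return svd(M, compute_uv=False)

# Example usage with simulated data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = X @ rng.normal(size=(3, 2)) + rng.normal(size=(200, 2))
print(robust_cca(X, Y))
```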
Appears in Collections: Mathematical Physics
Dept of Mathematics Theses

Files in This Item:
File | Description | Size | Format
FulltextThesis.pdf | | 2.08 MB | Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.