Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/1820
Title: An incremental approach to contribution-based feature selection
Authors: Guan, SU
Liu, J
Qi, Y
Keywords: Feature selection;Neural network;Incremental training;Knock-out
Issue Date: 2004
Publisher: Freund & Pettman
Citation: Journal of Intelligent Systems. 13 (1) 15-44
Abstract: This paper presents a novel feature selection approach based on incremental neural network (NN) training. Instead of training all input attributes in batch, this incremental approach trains input attributes one by one, so that network performance is refined as each new attribute is introduced. If an incoming attribute is consistent with the previous attributes and relevant to the output attributes, network performance will improve; otherwise it will degrade. The contribution of an input attribute is thus evaluated through network performance evaluation, and attributes with little or no contribution are discarded. To ensure fair feature selection, the individual discrimination ability of each attribute is evaluated before training, using a NN with only one attribute in the input layer. The attribute with the best discrimination ability is introduced first, followed by the attributes with lower discrimination ability. Two feature-detection methods based on this incremental training approach are discussed. Unlike existing feature selection methods, the proposed methods are suitable not only for classification problems but also for regression problems. Experimental results show that the methods work well on several benchmark problems and that NN accuracy improves after feature selection.
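The selection procedure the abstract outlines (rank attributes by individual discrimination ability, then introduce them one by one and keep only those that improve performance) can be sketched as follows. This is a simplified illustration, not the authors' method: a nearest-centroid classifier stands in for the neural network as the performance scorer, and the `tol` threshold for "little or no contribution" is a hypothetical parameter.

```python
# Illustrative sketch of incremental, contribution-based feature selection.
# A nearest-centroid classifier replaces the paper's neural network purely
# as a stand-in performance scorer (an assumption for this example).

def subset_accuracy(X, y, features):
    """Score a feature subset by nearest-centroid classification accuracy."""
    if not features:
        return 0.0
    classes = sorted(set(y))
    # Class centroids over the selected features only.
    centroids = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        centroids[c] = [sum(r[f] for r in rows) / len(rows) for f in features]
    correct = 0
    for x, label in zip(X, y):
        # Assign each sample to the nearest class centroid (squared distance).
        best = min(classes, key=lambda c: sum((x[f] - m) ** 2
                                              for f, m in zip(features, centroids[c])))
        if best == label:
            correct += 1
    return correct / len(y)

def incremental_feature_selection(X, y, tol=0.0):
    n_features = len(X[0])
    # Step 1: rank features by individual discrimination ability
    # (the paper evaluates each with a one-input NN).
    ranked = sorted(range(n_features),
                    key=lambda f: subset_accuracy(X, y, [f]), reverse=True)
    # Step 2: introduce features one by one; keep a feature only if it
    # improves performance, otherwise knock it out.
    selected, best_score = [], 0.0
    for f in ranked:
        score = subset_accuracy(X, y, selected + [f])
        if score > best_score + tol:
            selected.append(f)
            best_score = score
    return selected, best_score

# Toy usage: feature 0 separates the two classes, feature 1 is noise.
X = [[0.0, 0.9], [0.1, 0.1], [0.2, 0.5], [1.0, 0.8], [0.9, 0.2], [1.1, 0.4]]
y = [0, 0, 0, 1, 1, 1]
selected, score = incremental_feature_selection(X, y)
```

On this toy data the discriminative feature is kept and the noise feature is knocked out, mirroring the contribution-based filtering the abstract describes.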
URI: http://bura.brunel.ac.uk/handle/2438/1820
ISSN: 0334-1860
Appears in Collections: Electronic and Computer Engineering; Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
An Incremental Approach to Contribution-Based Feature Selection.txt (290 B, Text)


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.