Title: Feature selection for modular networks based on incremental training
Authors: Guan, SU
Liu, J
Keywords: Feature selection;Classifier;Neural network;Feedforward neural network;Incremental training;Input attribute
Issue Date: 2005
Publisher: Freund & Pettman
Citation: Journal of Intelligent Systems, 14(4), 353-383
Abstract: Feature selection plays an important role in classification systems. Using classifier error rate as the evaluation function, feature selection is integrated with incremental training. A neural network classifier is implemented with an incremental training approach to detect and discard irrelevant features. By learning attributes one after another, our classifier can directly identify the attributes that make no contribution to classification. These attributes are marked and considered for removal. Combined with an FLD feature ranking scheme, three batch removal methods based on classifier error rate have been developed to discard irrelevant features. These feature selection methods significantly reduce the computational complexity of searching among a large number of candidate subsets. Experimental results show that our feature selection method works well on several benchmark problems. The selected subsets are further validated by a Constructive Backpropagation (CBP) classifier, which confirms increased classification accuracy and reduced training cost.
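The abstract describes ranking attributes (via FLD), introducing them one at a time, and keeping only those that lower the classifier error rate. The following is a minimal illustrative sketch of that general idea, not the authors' implementation: it uses a Fisher-style per-feature score for ranking and a simple nearest-centroid classifier (an assumption, standing in for the paper's neural network) to measure error rate.

```python
# Sketch of error-rate-driven incremental feature selection.
# Assumptions (not from the paper): two classes labeled 0/1, a
# Fisher-style score for ranking, and a nearest-centroid classifier
# as a cheap stand-in for the neural network evaluator.

def fisher_score(xs, ys, j):
    # Fisher-style score for feature j: squared mean separation
    # divided by the sum of within-class variances.
    a = [x[j] for x, y in zip(xs, ys) if y == 0]
    b = [x[j] for x, y in zip(xs, ys) if y == 1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((v - ma) ** 2 for v in a) / len(a)
    vb = sum((v - mb) ** 2 for v in b) / len(b)
    return (ma - mb) ** 2 / (va + vb + 1e-9)

def error_rate(xs, ys, feats):
    # Training-set error of a nearest-centroid classifier
    # restricted to the selected feature subset.
    if not feats:
        return 1.0
    cents = {}
    for c in (0, 1):
        pts = [x for x, y in zip(xs, ys) if y == c]
        cents[c] = [sum(p[j] for p in pts) / len(pts) for j in feats]
    errs = 0
    for x, y in zip(xs, ys):
        d = {c: sum((x[j] - cents[c][k]) ** 2
                    for k, j in enumerate(feats)) for c in (0, 1)}
        errs += min(d, key=d.get) != y
    return errs / len(xs)

def select_features(xs, ys):
    # Rank features by score, then add them one at a time,
    # keeping a feature only if it lowers the error rate.
    n = len(xs[0])
    order = sorted(range(n), key=lambda j: -fisher_score(xs, ys, j))
    kept, best = [], 1.0
    for j in order:
        e = error_rate(xs, ys, kept + [j])
        if e < best:
            kept.append(j)
            best = e
    return kept, best
```

On a toy dataset where only the first attribute separates the classes, the procedure keeps that attribute and discards the uninformative one; the paper's batch removal variants extend this single-feature loop by dropping groups of low-ranked features at once.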
ISSN: 0334-1860
Appears in Collections: Electronic and Computer Engineering; Dept of Electronic and Computer Engineering Research Papers

Files in This Item:
File: Feature Selection for Modular Neural Network Classifiers.txt (Text, 275 B)

Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.