Title: Incremental training based on input space partitioning and ordered attribute presentation with backward elimination
Authors: Guan, SU
Ang, J
Keywords: Data presentation order;Input space partitioning;Incremental training;Backward elimination;Neural networks
Issue Date: 2005
Publisher: Freund & Pettman
Citation: Journal of Intelligent Systems, 14(4), 321-351
Abstract: A neural network training method, ID-BT (Incremental Discriminatory Batch Training), is presented in this paper. It separates the input attributes into two batches, significant and insignificant, and orders the attributes within each batch according to their individual discrimination ability before introducing them into the network. By backward-eliminating insignificant attributes that prove futile, the generalization accuracy of network training is increased. ID-BIT (Incremental Discriminatory Batch and Individual Training), which further improves on ID-BT, introduces the significant attributes individually and the insignificant attributes as a batch. The architecture used for both methods employs incremental learning algorithms. We tested our algorithms extensively on several widely used benchmark problems from PROBEN1. The simulation results show that these two methods outperform both incremental training with an increasing input dimension (ITID) and conventional batch training, i.e., training a neural network without any partitioning of the input space; we are able to achieve better network performance in terms of generalization accuracy.
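The abstract's pipeline (score each attribute's discrimination ability, partition into significant and insignificant batches, order within each batch, then backward-eliminate futile insignificant attributes) can be sketched as follows. This is a minimal illustration, not the paper's method: the Fisher-style discrimination score, the batch threshold, and the nearest-class-mean accuracy check used during elimination are all assumptions standing in for the incremental neural network training that ID-BT/ID-BIT actually use.

```python
# Illustrative sketch of attribute ordering by discrimination ability,
# batch partitioning, and backward elimination of futile attributes.
# The scoring function, threshold, and classifier are assumptions for
# illustration only; the paper trains incremental neural networks instead.
import random

random.seed(0)

def make_data(n=200, n_attr=6):
    """Synthetic two-class data: attributes 0-2 informative, 3-5 noise."""
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        row = [random.gauss(2.0 * label, 1.0) for _ in range(3)]  # informative
        row += [random.gauss(0.0, 1.0) for _ in range(3)]         # pure noise
        data.append((row, label))
    return data

def discrimination(data, j):
    """Fisher-style score: class-mean separation over pooled spread."""
    a = [x[j] for x, y in data if y == 0]
    b = [x[j] for x, y in data if y == 1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((v - ma) ** 2 for v in a) / len(a)
    vb = sum((v - mb) ** 2 for v in b) / len(b)
    return (ma - mb) ** 2 / (va + vb + 1e-9)

def accuracy(data, attrs):
    """Nearest-class-mean accuracy using only the attributes in `attrs`."""
    if not attrs:
        return 0.0
    means = {}
    for c in (0, 1):
        rows = [x for x, y in data if y == c]
        means[c] = [sum(r[j] for r in rows) / len(rows) for j in attrs]
    correct = 0
    for x, y in data:
        d = {c: sum((x[j] - m) ** 2 for j, m in zip(attrs, means[c]))
             for c in (0, 1)}
        correct += (min(d, key=d.get) == y)
    return correct / len(data)

data = make_data()
scores = {j: discrimination(data, j) for j in range(6)}
ranked = sorted(scores, key=scores.get, reverse=True)  # order by ability
threshold = 0.5  # assumed cut-off between the two batches
significant = [j for j in ranked if scores[j] >= threshold]
insignificant = [j for j in ranked if scores[j] < threshold]

# Backward elimination: starting from all attributes, drop each
# insignificant attribute (weakest first) if removing it does not
# lower the accuracy proxy.
kept = significant + insignificant
for j in reversed(insignificant):  # weakest discriminators first
    trial = [k for k in kept if k != j]
    if accuracy(data, trial) >= accuracy(data, kept):
        kept = trial

print("significant batch:", significant)
print("kept after elimination:", sorted(kept))
```

Under ID-BT both batches would be introduced to the network as batches; under ID-BIT the significant attributes would instead be introduced one at a time, in the ranked order computed above.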
ISSN: 0334-1860
Appears in Collections: Electronic and Computer Engineering
Dept of Electronic and Computer Engineering Research Papers

Files in This Item:
File: Closed Access - Guan et al - Journal of Intelligent Systems 2005.txt
Size: 523 B
Format: Text

Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.