Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/1497
Title: | Incremental training based on input space partitioning and ordered attribute presentation with backward elimination |
Authors: | Guan, SU; Ang, J |
Keywords: | Data presentation order;Input space partitioning;Incremental training;Backward elimination;Neural networks |
Issue Date: | 2005 |
Publisher: | Freund & Pettman |
Citation: | Journal of Intelligent Systems. 14 (4) 321-351 |
Abstract: | A neural network training method, ID-BT (Incremental Discriminatory Batch Training), is presented in this paper. It separates the input space into two batches, significant and insignificant attributes, and orders the attributes within each batch according to their individual discrimination ability before introducing them into the network. By backward eliminating insignificant attributes that prove futile, the generalization accuracy of network training is increased. ID-BIT (Incremental Discriminatory Batch and Individual Training), which further improves on ID-BT, introduces the significant attributes individually and the insignificant attributes as a batch. The architecture used for both methods employs incremental learning algorithms. We tested our algorithms extensively using several widely used benchmark problems, namely PROBEN1. The simulation results show that these two methods outperform incremental training with an increasing input dimension (ITID) and conventional batch training, i.e. training of a neural network without any partitioning of the input space; we are able to achieve better network performance in terms of generalization accuracy. |
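The record contains no code; as a rough, hypothetical illustration of the attribute ordering, batch partitioning, and backward-elimination steps described in the abstract, the sketch below uses a toy class-mean-difference score in place of the paper's actual discrimination measure, and a caller-supplied evaluation function in place of full network retraining.

```python
# Hypothetical sketch of ID-BT-style preprocessing (not the authors' code).
# The discrimination score and the evaluation callback are stand-ins for
# the paper's network-based measures.

def discrimination(attr_values, labels):
    """Toy discrimination score for a binary problem: absolute difference
    between the attribute's mean value in each class."""
    pos = [v for v, y in zip(attr_values, labels) if y == 1]
    neg = [v for v, y in zip(attr_values, labels) if y == 0]
    return abs(sum(pos) / len(pos) - sum(neg) / len(neg))

def partition_and_order(columns, labels, threshold):
    """Split attribute indices into significant/insignificant batches and
    order each batch by descending discrimination ability."""
    scores = [(i, discrimination(col, labels)) for i, col in enumerate(columns)]
    scores.sort(key=lambda item: item[1], reverse=True)
    significant = [i for i, s in scores if s >= threshold]
    insignificant = [i for i, s in scores if s < threshold]
    return significant, insignificant

def backward_eliminate(insignificant, evaluate):
    """Drop each insignificant attribute whose removal does not hurt the
    evaluation score (in the paper: generalization accuracy)."""
    kept = list(insignificant)
    for attr in list(insignificant):
        trial = [a for a in kept if a != attr]
        if evaluate(trial) >= evaluate(kept):
            kept = trial
    return kept

# Toy data: three attributes (columns) over four samples.
columns = [[1.0, 1.0, 0.0, 0.0],
           [0.9, 0.8, 0.1, 0.2],
           [0.5, 0.4, 0.6, 0.5]]
labels = [1, 1, 0, 0]

sig, insig = partition_and_order(columns, labels, threshold=0.5)
# Attributes 0 and 1 separate the classes well; attribute 2 barely does.
```

Under ID-BT, the significant batch would then be trained first and the surviving insignificant attributes appended afterwards; ID-BIT would instead feed the significant attributes in one at a time.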
URI: | http://bura.brunel.ac.uk/handle/2438/1497 |
DOI: | https://doi.org/10.1515/jisys.2005.14.4.321 |
ISSN: | 0334-1860 |
Appears in Collections: | Electronic and Electrical Engineering: Dept of Electronic and Electrical Engineering Research Papers |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
FullText.txt | | 523 B | Text | View/Open
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.