Growing Neural Networks
Growing neural networks closely resemble the forward selection
technique used in multiple linear regression. The principal goal of a
growing neural network is to perform feature selection during the
growing process.
The method starts with a neural network having only one input neuron.
Each feature is then selected in turn, and the network is trained and
evaluated using only that feature. After all features have been processed,
the feature which leads to the best result is stored and attached to the
first neuron of the input layer.
After this, the input layer of the network grows by one neuron and the
selection process is repeated in the same way as described above. Thus, the
best features of the previous runs are combined with the new feature which
gives the largest increase in model performance. To prevent the
network from selecting a feature more than once, features which have
already been selected are omitted from the rest of the selection process.
The growing process is stopped when a test data set no longer shows any
increase in model performance (see the sketch below).
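
As an illustration, the following is a minimal sketch of this growing,
forward-selection loop in Python. It is not the implementation described
in the text: as a stand-in for a fast RBF network it uses an RBF-kernel
ridge regressor (scikit-learn's KernelRidge), and the function name
grow_features, the stopping tolerance tol, and the kernel parameters are
illustrative assumptions.

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.metrics import r2_score

    def grow_features(X_train, y_train, X_test, y_test, tol=1e-4):
        """Greedy forward selection of input features ('growing' the input layer)."""
        selected = []                              # features already attached to the input layer
        remaining = list(range(X_train.shape[1]))  # candidate features not yet used
        best_score = -np.inf
        while remaining:
            trial_scores = {}
            # train and evaluate one model per remaining candidate feature
            for j in remaining:
                cols = selected + [j]
                model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
                model.fit(X_train[:, cols], y_train)
                trial_scores[j] = r2_score(y_test, model.predict(X_test[:, cols]))
            j_best = max(trial_scores, key=trial_scores.get)
            # abort growing when the test data show no further improvement
            if trial_scores[j_best] <= best_score + tol:
                break
            best_score = trial_scores[j_best]
            selected.append(j_best)                # grow the input layer by one neuron
            remaining.remove(j_best)               # each feature may be selected only once
        return selected, best_score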
Growing networks are very demanding in terms of computing power, since
each single step of the feature selection requires a full training of the
neural network. In fact, growing networks are not feasible with training
techniques such as back-propagation. However, they can be computed with
fast networks such as RBF networks.
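
For a rough sense of the cost: the sketch above fits one model per
remaining candidate feature at every growth step, so selecting k out of d
features requires on the order of d*k trainings, which is why a fast model
is essential. A toy run of the hypothetical grow_features function on
synthetic data might look like this:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))          # 10 candidate features
    y = np.sin(X[:, 2]) + 0.5 * X[:, 7] + 0.1 * rng.normal(size=200)
    # split into training and test halves; the test half controls when growing stops
    selected, score = grow_features(X[:100], y[:100], X[100:], y[100:])
    print(selected, round(score, 3))        # indices of the grown inputs and the test R^2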
Last Update: 2006-Jan-17