BATCHgdwm.MLPnet {AMORE}    R Documentation

Batch gradient descent with momentum training

Description

Updates the neural network weights and biases by applying batch gradient descent with momentum to the training set.

Usage

BATCHgdwm.MLPnet(net, P, T, n.epochs)

Arguments

net       Neural network to train.
P         Input data set.
T         Target output data set.
n.epochs  Number of training epochs.

Value

This function returns a neural network object with its weights and biases updated according to the training data.

Author(s)

Manuel Castejón Limas. manuel.castejon@unileon.es
Joaquin Ordieres Meré. joaquin.ordieres@dim.unirioja.es
Ana González Marcos. ana.gonzalez@unileon.es
Alpha V. Pernía Espinoza. alpha.pernia@alum.unirioja.es
Eliseo P. Vergara Gonzalez. eliseo.vergara@dim.unirioja.es
Francisco Javier Martinez de Pisón. francisco.martinez@dim.unirioja.es
Fernando Alba Elías. fernando.alba@unavarra.es

References

Simon Haykin. Neural Networks: A Comprehensive Foundation. 2nd edition, Prentice Hall, New Jersey, 1999. ISBN 0-13-273350-1.

See Also

newff, train, BATCHgd.MLPnet
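
Examples

The sketch below is illustrative and not part of the original help page: the newff call, the network topology, the hyperparameters, and the toy data are assumptions made for demonstration, and in typical use this low-level routine is driven by the higher-level train function listed in See Also.

library(AMORE)

# Illustrative network (not from this help page): a 1-3-2-1 multilayer
# perceptron configured for batch gradient descent with momentum.
net <- newff(n.neurons=c(1, 3, 2, 1), learning.rate.global=1e-2,
             momentum.global=0.5, error.criterium="LMS", Stao=NA,
             hidden.layer="tansig", output.layer="purelin",
             method="BATCHgdwm")

# Toy training set: learn T = P^2 on [-1, 1], one sample per row.
P <- matrix(seq(-1, 1, length=100), ncol=1)
T <- P^2

# Update the weights and biases over 200 epochs and keep the result.
net <- BATCHgdwm.MLPnet(net, P, T, n.epochs=200)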

