trains

Sequential order incremental training with learning functions

Syntax

net.trainFcn = 'trains'
[net,tr] = train(net,...)

Description

trains is not called directly. Instead it is called by train for networks whose net.trainFcn property is set to 'trains', thus:

net.trainFcn = 'trains' sets the network trainFcn property.

[net,tr] = train(net,...) trains the network with trains.

trains trains a network using its weight and bias learning rules with sequential updates. The sequence of inputs is presented to the network, and updates occur after each time step.

This incremental training algorithm is commonly used for adaptive applications.
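As a minimal sketch (the linear layer, learning rate, epoch count, and sequence data below are illustrative assumptions, not requirements), the following trains a linear layer incrementally with trains. Because the inputs and targets are cell arrays, train treats them as a sequence.

X = {0 1 2 3 4};                % cell array input, so treated as a sequence
T = {0 2 4 6 8};                % target sequence (T = 2*X)
net = linearlayer(0, 0.01);     % linear layer, Widrow-Hoff learning rate 0.01
net = configure(net, X, T);     % size the network for this data
net.trainFcn = 'trains';        % update weights and biases after each time step
net.trainParam.epochs = 20;     % passes through the sequence
[net, tr] = train(net, X, T);
Y = net(X);                     % network outputs after training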

Training occurs according to trains training parameters, shown here with their default values:

net.trainParam.epochs            1000    Maximum number of epochs to train
net.trainParam.goal              0       Performance goal
net.trainParam.show              25      Epochs between displays (NaN for no displays)
net.trainParam.showCommandLine   false   Generate command-line output
net.trainParam.showWindow        true    Show training GUI
net.trainParam.time              Inf     Maximum time to train in seconds
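You can override these defaults on the network before calling train. The values in this brief sketch are arbitrary choices for illustration, assuming net.trainFcn has already been set to 'trains'.

net.trainParam.showWindow = false;       % suppress the training GUI
net.trainParam.showCommandLine = true;   % print progress at the command line
net.trainParam.show = 10;                % report every 10 epochs
net.trainParam.time = 60;                % stop after at most 60 seconds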

Network Use

You can create a standard network that uses trains for adapting by calling perceptron or linearlayer.

To prepare a custom network to adapt with trains,

  1. Set net.adaptFcn to 'trains'. This sets net.adaptParam to trains’s default parameters.

  2. Set each net.inputWeights{i,j}.learnFcn to a learning function. Set each net.layerWeights{i,j}.learnFcn to a learning function. Set each net.biases{i}.learnFcn to a learning function. (Weight and bias learning parameters are automatically set to default values for the given learning function.)

To allow the network to adapt,

  1. Set weight and bias learning parameters to desired values.

  2. Call adapt.

See help perceptron and help linearlayer for adaptation examples.
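The following sketch pulls these steps together for a single linear layer. The learning function (learnwh), learning rate, and sequence data are illustrative assumptions rather than required choices.

X = {1 2 -1 3};                               % input sequence
T = {2 4 -2 6};                               % target sequence
net = linearlayer;
net = configure(net, X, T);                   % size the network for this data
net.adaptFcn = 'trains';                      % step 1: adapt with trains
net.inputWeights{1,1}.learnFcn = 'learnwh';   % step 2: weight learning function
net.biases{1}.learnFcn = 'learnwh';           %         bias learning function
net.inputWeights{1,1}.learnParam.lr = 0.02;   % desired learning rates
net.biases{1}.learnParam.lr = 0.02;
[net, Y, E] = adapt(net, X, T);               % one pass through the sequence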

Algorithms

Each weight and bias is updated according to its learning function after each time step in the input sequence.

Version History

Introduced before R2006a

See Also
