Creating an Elman Network (newelm)
An Elman network with two or more layers can be created with the function newelm. The hidden layers commonly have tansig transfer functions, so that is the default for newelm. As shown in the architecture diagram, purelin is commonly the output-layer transfer function.
The default backpropagation training function is traingdx. One might use trainlm, but it tends to proceed so rapidly that it does not necessarily do well in the Elman network. The default backpropagation weight/bias learning function is learngdm, and the default performance function is mse.
When the network is created, each layer's weights and biases are initialized with the Nguyen-Widrow layer initialization method, implemented in the function initnw.
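The training, learning, and performance defaults can be overridden when the network is created. A minimal sketch, assuming the full newelm calling syntax newelm(PR,[S1...SNl],{TF1...TFNl},BTF,BLF,PF), in which the training function follows the transfer functions:
% Assumed syntax: pass an alternative training function as the
% fourth argument; the other defaults are left in place.
net = newelm([0 1],[5 1],{'tansig','logsig'},'traingdm');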
Now consider an example. Suppose that we have a sequence of single-element input vectors in the range from 0 to 1. Suppose further that we want to have five hidden-layer tansig neurons and a single logsig output layer. The following code creates the desired network.
net = newelm([0 1],[5 1],{'tansig','logsig'});
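To confirm how the network was set up, you can inspect the network object. A quick sanity check, assuming the standard network object property names:
net.trainFcn                   % training function, e.g. 'traingdx'
net.layers{1}.transferFcn      % 'tansig' in the hidden layer
net.layers{2}.transferFcn      % 'logsig' in the output layer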
Simulation
Suppose that we want to find the response of this network to an input sequence of eight digits that are either 0 or 1.
P = round(rand(1,8))
P =
     0     1     0     1     1     0     0     0
Recall that a sequence to be presented to a network must be in cell array form. We can convert P to this form with
Pseq = con2seq(P)
Pseq =
    [0]    [1]    [0]    [1]    [1]    [0]    [0]    [0]
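For a single-row matrix such as P, con2seq simply splits the matrix into a cell array with one element per time step. A minimal sketch of the equivalent operation in base MATLAB, valid for this one-input case only:
% Equivalent to con2seq for a 1-by-N matrix: one cell per time step.
Pseq = num2cell(P);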
Now we can find the output of the network with the function sim.
Y = sim(net,Pseq)
Y =
  Columns 1 through 5
    [1.9875e-04]    [0.1146]    [5.0677e-05]    [0.0017]    [0.9544]
  Columns 6 through 8
    [0.0014]    [5.7241e-05]    [3.6413e-05]
We convert this back to concurrent form with
z = seq2con(Y);
and can finally display the output in concurrent form with
z{1,1}
ans =
  Columns 1 through 7
    0.0002    0.1146    0.0001    0.0017    0.9544    0.0014    0.0001
  Column 8
    0.0000
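Since seq2con is the inverse of con2seq, for this single-output network the same row vector could also be recovered in base MATLAB with cell2mat:
% Equivalent to z{1,1} for this one-output network.
zmat = cell2mat(Y);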
Thus, once the network is created and the input specified, one need only call sim.
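Putting the steps together, the complete session is only a few lines (the random input means the outputs will differ from run to run):
% Create the Elman network, simulate it on a random binary
% sequence, and convert the output back to concurrent form.
net  = newelm([0 1],[5 1],{'tansig','logsig'});
P    = round(rand(1,8));     % eight inputs, each 0 or 1
Pseq = con2seq(P);           % matrix -> sequence (cell array)
Y    = sim(net,Pseq);        % simulate the network
z    = seq2con(Y);           % sequence -> concurrent form
z{1,1}                       % display the eight outputs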