Initialization (init)
You can use the function init to reset the network weights and biases to their original values. Suppose, for instance, that you start with the network
net = newp([-2 2;-2 +2],1);
You can verify that the weights are zero with
wts = net.IW{1,1}
wts =
     0     0
In the same way, you can verify that the bias is 0 with
bias = net.b{1}
bias =
     0
Now set the weights to the values 3 and 4 and the bias to the value 5 with
net.IW{1,1} = [3,4];
net.b{1} = 5;
Recheck the weights and bias, as shown above, to verify that the change has been made.
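For example, repeating the commands used earlier to read the values back:
wts = net.IW{1,1}
bias = net.b{1}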
Sure enough,
wts =
     3     4
bias =
     5
Now use init to reset the weights and bias to their original values.
net = init(net);
You can check, as shown above, to verify that the weights and bias have been reset.
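Repeating the same commands as before:
wts = net.IW{1,1}
bias = net.b{1}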
wts =
     0     0
bias =
     0
You can change the way a perceptron is initialized with init. For instance, you can redefine the network input weight and bias initFcns as rands, and then apply init as shown below.
net.inputweights{1,1}.initFcn = 'rands';
net.biases{1}.initFcn = 'rands';
net = init(net);
Now check on the weights and bias.
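You can read the values back with
wts = net.IW{1,1}
biases = net.b{1}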
wts =
    0.2309    0.5839
biases =
   -0.1106
You can see that the weights and bias have been assigned random values.
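Note that because rands generates random values, calling init again should assign a new set of weights and biases. A minimal sketch, assuming the initFcn settings above are still in place (the exact numbers will differ from run to run):
net = init(net);
wts = net.IW{1,1}
biases = net.b{1}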
You may want to read more about init in Advanced Topics in Chapter 12.