Yumianzi

Reputation: 11

Why do I get different neural network training results each time, even when the initial weights are the same?

I know that the results will be different if the initial weights and biases are random, so I used a Genetic Algorithm (GA) to optimize the structure of a BP neural network and set the GA-optimized initial weights and biases before training. I am working in Matlab R2014a, and my code is as follows:

clc
clear all;

LoopTime = 100; 
NetAll = cell(1,LoopTime);
MatFileToSave = sprintf('BPTraining_%d.mat',LoopTime); % '%4d' would pad with a space and put a blank into the file name
input_train = xlsread('test.xls','D1:F40')';
output_train = xlsread('test.xls','H1:H40')';

[inputn,inputps] = mapminmax(input_train);
[outputn,outputps] = mapminmax(output_train);

A=[];

if ~exist(MatFileToSave,'file')
for  ii = 1:LoopTime
    net = newff(inputn,outputn,7,{'tansig'},'trainlm');
    % newff returns a fresh network, so the division and training
    % settings must be applied after it is created, not before
    net.divideFcn = 'dividerand';
    net.divideMode = 'sample';
    net.divideParam.trainRatio = 70/100;
    net.divideParam.valRatio = 30/100;
    net.divideParam.testRatio = 0/100;
    net.trainParam.epochs = 2000;
    net.trainParam.lr = 0.1;
    net.trainParam.goal = 0.00001;

    % initial weights and biases produced by the GA
    net.iw{1,1} = [ 0.56642385, -0.044929342,  2.806006491;
                   -0.129892602,  2.969433103, -0.056528269;
                    0.200067228, -1.074037985, -0.90233406;
                   -0.794299829, -2.202876191,  0.346403187;
                    0.083438759,  1.246476813,  1.788348379;
                    0.889662621,  1.024847111,  2.428373515;
                   -1.24788069,   1.383238864, -1.313847905];
    net.b{1} = [-1.363912639;-1.978352461;-0.036013077;0.135126212;1.995020537;-0.223083372;-1.013341625];
    net.lw{2,1} = [-0.412881802 -0.146069773 1.711325447 -1.091444059 -2.069737603 0.765038862 -2.825474689];
    net.b{2} = -2.182832342;

    [net,tr]=train(net,inputn,outputn);

    yyn = sim(net,inputn);
    yy = mapminmax('reverse',yyn,outputps);
    cc = corrcoef(yy,output_train);
    regre = cc(1,2);                                   % regression (correlation) coefficient
    relerr = mean((yy - output_train)./output_train);  % mean relative error; './' for element-wise division, and 'relerr' avoids shadowing the built-in error()
    rmse = sqrt(mean((yy - output_train).^2));         % root-mean-square error; std(yy) is not the RMSE

    A = [A;ii,regre,relerr,rmse];

    NetAll{ii} = net;

    clear net;

    figure
    plotregression(output_train,yy,'Regression');

    folder = 'regre_tr';
    if ~exist(folder,'dir')
        mkdir(folder);
    end
    picstr = fullfile(folder,['regre_' num2str(ii)]);
    print('-dpng','-r100',picstr);
    close

end 

save(MatFileToSave,'NetAll');

xlswrite(sprintf('BPTraining_%d.xls',LoopTime),A);
end

I wrapped the training in a 'for-end' loop to check whether the results are the same each time, but the regression coefficient varies from 0.8 to 0.98 and is never the same, contrary to what I expected.

So, my question is:

  1. Is my code for setting up the initial weights correct? If not, how should I set them up?
  2. If it is correct, why are the results still different?

Upvotes: 0

Views: 811

Answers (1)

Finn

Reputation: 2343

It is exactly as @Neil Slater said: somewhere inside the dividerand function there is a random part, and if you don't fix that randomness the results will vary. Try running this multiple times:

[a,b,c]=dividerand(10,0.7,0.15,0.15)
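The index vectors come out differently on every call because dividerand draws from Matlab's global random stream. As a quick check (a sketch; rng and isequal are standard Matlab functions), resetting the seed before each call makes the split repeatable:

rng(0);                                    % fix the global random stream
[a1,b1,c1] = dividerand(10,0.7,0.15,0.15);
rng(0);                                    % same seed again
[a2,b2,c2] = dividerand(10,0.7,0.15,0.15);
isequal(a1,a2) && isequal(b1,b2) && isequal(c1,c2)   % true: identical split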

Without such a reset, the results change on every call. You can either switch to a different division function via 'net.divideFcn', as Neil suggested, or create your own dividerand function with a fixed random seed (I will skip over the fact that a fixed dividerand isn't really random anymore); both options are wired up in the sketch at the end of this answer. A copy of dividerand can be modified like this:

open dividerand                 % opens the Matlab function in the editor
% change the name of the function in line 1:
function [out1,out2,out3,out4,out5,out6] = mydividerand(in1,varargin)
% go to line 105 and add:
rng(1);                         % fix the random seed
allInd = randperm(Q);           % previous line 105; this is the random part of the function
% use 'Save As' and save it to your work folder

Please keep in mind that it is not wise to change the original 'dividerand', so do the 'Save As' before you touch anything. The new function also needs a different, unique name (like mydividerand).

[a,b,c] = mydividerand(10,0.7,0.15,0.15) % always the same result
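To plug either option into the training script from the question, the relevant settings would look roughly like this (a sketch; 'divideblock' is a standard deterministic division function from the toolbox, and 'mydividerand' is the seeded copy created above, which must be on the Matlab path):

net = newff(inputn,outputn,7,{'tansig'},'trainlm');

% Option 1: a built-in deterministic division function
% (contiguous blocks of samples, no randomness involved)
net.divideFcn = 'divideblock';

% Option 2: the seeded copy of dividerand created above
% net.divideFcn = 'mydividerand';

net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio   = 30/100;
net.divideParam.testRatio  = 0/100;

With the data division made deterministic and the initial weights set explicitly, trainlm should then produce the same result on every run.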

Upvotes: 3
