Soulzityr

Reputation: 456

Neural Networks - Updating the Network

I'm building my first neural network in Java, and I'm following this C++ example online:

vector<double> CNeuralNet::Update(vector<double> &inputs)
{
    //stores the resultant outputs from each layer
    vector<double> outputs;

    int cWeight = 0;

    //first check that we have the correct amount of inputs
    if (inputs.size() != m_NumInputs)
    {
        //just return an empty vector if incorrect
        return outputs;
    }

    //For each layer...
    for (int i=0; i<m_NumHiddenLayers + 1; ++i)
    {
        if ( i > 0 )
        {
            inputs = outputs;
        }

        outputs.clear();
        cWeight = 0;

        //for each neuron sum the (inputs * corresponding weights). Throw
        //the total at our sigmoid function to get the output.
        for (int j=0; j<m_vecLayers[i].m_NumNeurons; ++j)
        {
            double netinput = 0;

            int NumInputs = m_vecLayers[i].m_vecNeurons[j].m_NumInputs;

            //for each weight
            for (int k=0; k<NumInputs - 1; ++k)
            {
                //sum the weights x inputs
                netinput += m_vecLayers[i].m_vecNeurons[j].m_vecWeight[k] *
                            inputs[cWeight++];
            }

            //add in the bias
            netinput += m_vecLayers[i].m_vecNeurons[j].m_vecWeight[NumInputs-1] *
                        CParams::dBias;

            //we can store the outputs from each layer as we generate them.
            //The combined activation is first filtered through the sigmoid
            //function
            outputs.push_back(Sigmoid(netinput, CParams::dActivationResponse));

            cWeight = 0;
        }
    }

    return outputs;
}

I have two questions about this code. First, there's the seemingly weird assignment of outputs to inputs:

//For each layer...
for (int i=0; i<m_NumHiddenLayers + 1; ++i)
{
    if ( i > 0 )
    {
        inputs = outputs;
    }

    outputs.clear();

This part really confuses me. outputs was just created, so why assign it to inputs? Also, why ++i? As far as I can tell, in his code before this he still uses index [0], which is what I'm doing. Why the sudden change? Is there a reason for it? I understand this might be a hard question to answer without seeing the rest of the code...
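To illustrate what that handoff is doing, here is a minimal, hypothetical feedforward sketch (this is not the author's class: weights are plain nested vectors and the bias term is omitted for brevity). Each pass through the outer loop, the outputs just produced by one layer become the inputs of the next layer, which is exactly what inputs = outputs; achieves:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// layers[i][j] is the weight vector of neuron j in layer i (hypothetical layout).
// inputs is taken by value so we can overwrite it layer by layer.
std::vector<double> Feedforward(
    const std::vector<std::vector<std::vector<double>>>& layers,
    std::vector<double> inputs)
{
    std::vector<double> outputs;

    for (const auto& layer : layers)
    {
        outputs.clear();

        for (const auto& weights : layer)
        {
            // weighted sum of this neuron's inputs
            double net = 0.0;
            for (std::size_t k = 0; k < weights.size(); ++k)
                net += weights[k] * inputs[k];

            // squash through a sigmoid
            outputs.push_back(1.0 / (1.0 + std::exp(-net)));
        }

        // the handoff: this layer's outputs feed the next layer
        inputs = outputs;
    }

    return outputs;
}
```

With a single neuron whose weights are all zero, the net input is 0 and the sigmoid returns 0.5, regardless of the inputs.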

My second question is:

//add in the bias
netinput += m_vecLayers[i].m_vecNeurons[j].m_vecWeight[NumInputs-1] *
            CParams::dBias;

//we can store the outputs from each layer as we generate them.
//The combined activation is first filtered through the sigmoid
//function
outputs.push_back(Sigmoid(netinput, CParams::dActivationResponse));

CParams::dBias and CParams::dActivationResponse don't appear anywhere before this point. For now I've created two static final globals to substitute for them. Am I on the right track?
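For reference, a minimal C++ stand-in for CParams might look like the sketch below. The concrete values are assumptions (in projects of this style they are typically loaded from a parameter file at startup); the point is only that they are compile-time constants shared by the whole network, which is functionally the same as the static final globals proposed above:

```cpp
#include <cassert>

// Hypothetical stand-in for the missing CParams class. The values here are
// assumed, not taken from the original project.
struct CParams
{
    static constexpr double dBias = -1.0;               // assumed bias input value
    static constexpr double dActivationResponse = 1.0;  // assumed sigmoid steepness
};
```

The Java equivalent would simply be static final double fields on a constants class, so the substitution described in the question works the same way.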

Any help would be appreciated. This is a personal project and I haven't been able to stop thinking about this subject since I first learned about it two weeks ago.

Upvotes: 2

Views: 378

Answers (3)

himanshu1496

Reputation: 1921

I agree with @Kohakukun, and I want to add to his answer. As I see it, the outputs are assigned to the inputs in order to calculate the outputs of the next layer of the neural network. A network can have multiple layers; in the project I am working on I have multiple hidden layers, and looking at your code it seems to have a similar arrangement. So if you relate our answers to your code, it should resolve your doubt to some extent.

Upvotes: 2

Kohakukun

Reputation: 276

For your first question: the inputs are assigned the just-generated outputs to refine your neural network with backward induction, so it can learn.

For the second question: I think you are on the right track, since the bias does not change with each iteration.

Upvotes: 0

A Lan

Reputation: 425

In a for statement, the third part is not executed until the end of each iteration, just before the loop starts again. That means for (int i=0; i<10; ++i) does exactly the same thing as for (int i=0; i<10; i++). As for running inputs = outputs; only when i > 0 — is that not the correct behavior? CParams should be a class or namespace name, so it must exist somewhere in the whole project. If it is a class name, I think using static globals is fine.
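A quick illustrative check of that point: when the increment expression stands alone in the loop header, its value is discarded, so pre- and post-increment produce identical loops:

```cpp
#include <cassert>

// Two loops differing only in the increment form; both visit i = 0..9.
int SumPre()
{
    int s = 0;
    for (int i = 0; i < 10; ++i) s += i;  // pre-increment
    return s;
}

int SumPost()
{
    int s = 0;
    for (int i = 0; i < 10; i++) s += i;  // post-increment
    return s;
}
```

Both functions return 0 + 1 + ... + 9 = 45; ++i is sometimes preferred purely as a habit carried over from iterator types, where it can avoid a temporary copy.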

Upvotes: 0
