Reputation: 499
I've just started looking into recurrent neural networks. I found three sources of information on Elman's network (Elman 1991).
(Example and code) http://mnemstudio.org/neural-networks-elman.htm
(Paper) http://www.sysc.pdx.edu/classes/Werbos-Backpropagation%20through%20time.pdf
(Q&A) Elman and Jordan context values during training for neural network
According to the first resource, the weights from the hidden layer to the context layer and from the context layer to the hidden layer are not updated.
The second resource likewise sets these updates to 0, which means the weights are not updated either.
But in the third resource on Stack Overflow, the user claimed that "The context neuron values themselves are not updated as training progresses. The weights between them and the next layer ARE updated during training."
I understand that the context neurons save the values of the hidden neurons at time t and feed them (together with the input neurons) into the hidden neurons at t + 1. But do we have to update the weights in between?
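To make my understanding concrete, here is how I picture the forward pass (a sketch in plain NumPy; the names and layer sizes are made up by me, not taken from any of the sources):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4

W_in = rng.normal(scale=0.5, size=(n_hid, n_in))    # input -> hidden
W_ctx = rng.normal(scale=0.5, size=(n_hid, n_hid))  # context -> hidden

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

context = np.zeros(n_hid)            # context is empty before the first step
sequence = rng.normal(size=(5, n_in))

for x_t in sequence:
    # Hidden at time t sees the current input AND the saved hidden state from t - 1.
    hidden = sigmoid(W_in @ x_t + W_ctx @ context)
    # Save the hidden values for time t + 1: a plain copy, no weights involved.
    context = hidden.copy()
```

So my question is about `W_ctx` here (and any weights on the hidden-to-context copy): are they trained by backpropagation or left fixed?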
Upvotes: 3
Views: 4528
Reputation: 61
I am not sure if this question is still important, but here's my interpretation:
The weights from the hidden layer to the context layer are fixed at 1. Those do not get updated.
However, the weights from the context layer back to the hidden layer do get updated. How else would the network learn what to do with past values? If they did not change, what would be the right value to initialise them with? Surely not 1.
And the values of the context neurons WILL get updated during training: not through a sigmoid function of some sort, but simply by copying the values of the hidden layer.
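A sketch of one training step may make this split clearer (plain NumPy, made-up sizes, a simple squared-error loss, and backpropagation truncated to one step; none of this is taken verbatim from the sources above). The context-to-hidden weights `W_ctx` receive a gradient update like any other weights, while the hidden-to-context link is a parameter-free copy:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2

W_in = rng.normal(scale=0.5, size=(n_hid, n_in))    # input -> hidden (trained)
W_ctx = rng.normal(scale=0.5, size=(n_hid, n_hid))  # context -> hidden (trained)
W_out = rng.normal(scale=0.5, size=(n_out, n_hid))  # hidden -> output (trained)
context = np.zeros(n_hid)   # hidden -> context: a fixed copy, no weight matrix at all

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=n_in)
target = np.array([0.0, 1.0])
lr = 0.3
losses = []

for _ in range(50):
    # Forward: hidden sees the current input plus last step's context.
    hidden = sigmoid(W_in @ x + W_ctx @ context)
    y = W_out @ hidden
    err = y - target                          # dL/dy for L = 0.5 * ||y - target||^2
    losses.append(0.5 * float(err @ err))

    # Backward (truncated: gradients are not pushed into the old context values).
    delta = (W_out.T @ err) * hidden * (1.0 - hidden)
    W_out -= lr * np.outer(err, hidden)
    W_ctx -= lr * np.outer(delta, context)    # context -> hidden weights ARE trained
    W_in -= lr * np.outer(delta, x)

    # The context "update" is just a copy of the hidden values; nothing is learned here.
    context = hidden.copy()
```

Note that the copy line carries no learnable parameters, which is exactly why initialising the context-to-hidden weights at a fixed 1 would make no sense: all the learning about past values has to happen in `W_ctx`.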
Upvotes: 6