Reputation: 52
I am trying to build a parallel neural network in the following way:
My question is: how can I combine the trained networks' weights back into one network?
Upvotes: 2
Views: 357
Reputation: 171188
What Google does is have each thread/node train only a subset of the neurons. Combining them afterwards is then fairly easy, because each neuron's weights are written by exactly one worker.
You have to combine them regularly, though, so that they do not drift apart too much.
The unit that they use to partition the net is a "column" of neurons.
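Here is a minimal sketch of that idea, assuming a single NumPy weight matrix whose rows (neurons) are split into disjoint slices; the `train_slice` helper and the synchronization loop are illustrative placeholders, not part of any real framework:

```python
import numpy as np

n_neurons, n_inputs = 8, 4
W = np.zeros((n_neurons, n_inputs))          # shared parameter matrix

# Partition the neurons (rows of W) into disjoint "columns", one per worker,
# so every weight is owned and written by exactly one worker.
slices = [slice(0, 4), slice(4, 8)]

def train_slice(W_slice):
    """Hypothetical local training step: returns updated weights for the slice."""
    return W_slice + 0.01 * np.random.randn(*W_slice.shape)

for sync_round in range(10):                 # combine regularly, not just at the end
    local_results = [train_slice(W[s].copy()) for s in slices]
    # Merging is trivial: each slice has a single owner, so just write it back.
    for s, W_local in zip(slices, local_results):
        W[s] = W_local
```

Because ownership is exclusive, the merge step never has to reconcile conflicting updates; the periodic loop is what keeps the partitions from drifting apart.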
Upvotes: 0
Reputation: 126
Interesting approach. I think there could be two ways to combine the weights:

- take the plain average of the corresponding weights
- take a weighted average, weighting each network by its inverse RMSE/MSE
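A minimal sketch of both combination rules, assuming each network's weights are available as a list of NumPy arrays and that a validation MSE is known for each network (the function names and example shapes are illustrative):

```python
import numpy as np

def plain_average(weight_sets):
    """Element-wise mean of the corresponding weight arrays across networks."""
    return [np.mean(layer, axis=0) for layer in zip(*weight_sets)]

def error_weighted_average(weight_sets, mses):
    """Weighted mean where each network's contribution is proportional to 1/MSE."""
    coeffs = 1.0 / np.asarray(mses)
    coeffs /= coeffs.sum()                   # normalize so the weights sum to 1
    return [sum(c * w for c, w in zip(coeffs, layer))
            for layer in zip(*weight_sets)]

# Example: three networks, each with two weight matrices.
nets = [[np.random.randn(4, 3), np.random.randn(3, 1)] for _ in range(3)]
mses = [0.10, 0.25, 0.40]

combined_plain = plain_average(nets)
combined_weighted = error_weighted_average(nets, mses)
```

The inverse-error weighting simply gives networks with lower validation error more influence on the combined weights.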
Upvotes: 0