Andrii Tiertyshnyi

Reputation: 52

Parallel neural network

I am trying to build a parallel neural network in the following way:

  1. Create the network and the training set
  2. Divide the training set into N pieces (one piece per thread)
  3. Send a copy of the network and one piece of the training data to each thread
  4. Train the network copy on each thread
  5. Combine the neuron weights from the N networks (one from each thread)
  6. If the end conditions are not met, go to step 3.

My question is: how can I combine the weights of the N networks into a single network?
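
In rough Python, with NumPy arrays standing in for the network weights and a dummy update in place of real training (all names here are illustrative), the loop looks like this; combine_weights is the step being asked about, shown as a plain average only as a stand-in:

    import threading
    import numpy as np

    def train_on_slice(weights, data_slice):
        # Placeholder for real training on one slice of the data (steps 3-4).
        return weights + 0.01 * data_slice.mean()

    def combine_weights(weight_list):
        # Step 5 -- the part in question; plain averaging shown as a stand-in.
        return np.mean(weight_list, axis=0)

    def parallel_epoch(weights, training_set, n_threads=4):
        chunks = np.array_split(training_set, n_threads)            # step 2
        results = [None] * n_threads
        def worker(i):
            results[i] = train_on_slice(weights.copy(), chunks[i])  # steps 3-4
        threads = [threading.Thread(target=worker, args=(i,))
                   for i in range(n_threads)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return combine_weights(results)                              # step 5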

Upvotes: 2

Views: 357

Answers (2)

usr

Reputation: 171188

What Google does is they have each thread/node train only a subset of the neurons. Then, it's fairly easy to combine them back because each neuron was written only once.

You have to combine them regularly, though, so that they do not drift apart too much.

The unit that they use to partition the net is a "column" of neurons.
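
A minimal sketch of that idea, assuming a single NumPy weight matrix and a dummy update in place of real training: each thread owns a disjoint slice of columns and only ever writes those, so there is no merge step at all.

    import threading
    import numpy as np

    def train_columns(W, col_slice, rng):
        # Stand-in for a real gradient step: only the owned columns are touched.
        W[:, col_slice] += 0.01 * rng.standard_normal(
            (W.shape[0], col_slice.stop - col_slice.start))

    def column_partitioned_epoch(W, n_threads=4):
        bounds = np.linspace(0, W.shape[1], n_threads + 1, dtype=int)
        threads = [threading.Thread(
                       target=train_columns,
                       args=(W, slice(bounds[i], bounds[i + 1]),
                             np.random.default_rng(i)))
                   for i in range(n_threads)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        # No merge needed: every column of W was written by exactly one thread.
        return W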

Upvotes: 0

klubow

Reputation: 126

Interesting approach. I think there could be two ways:

  - average the weights
  - weight each network's contribution by its inverse RMSE/MSE
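
A quick sketch of both options in NumPy, assuming each thread returns its trained weights as an array plus the MSE it reached on its data slice (names are illustrative):

    import numpy as np

    def combine_average(weight_list):
        # Option 1: plain average of the N weight sets.
        return np.mean(weight_list, axis=0)

    def combine_inverse_mse(weight_list, mse_list, eps=1e-12):
        # Option 2: weighted average, where lower-error networks count for more.
        inv = 1.0 / (np.asarray(mse_list) + eps)
        coeffs = inv / inv.sum()
        return sum(c * w for c, w in zip(coeffs, weight_list))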

Upvotes: 0
