Reputation: 453
I have two laptops and want to use both to train a DL model. I have no experience with distributed systems and want to know whether it is possible to combine the processing power of the two laptops to train a single model. What about tf.distribute.experimental.ParameterServerStrategy? Will it be of any use?
Upvotes: 2
Views: 2702
Reputation:
Yes, you can use multiple machines to train your model, but you need to set up the cluster and worker configuration on both of them via the TF_CONFIG environment variable, as shown below.
import json
import os

# On two separate laptops, replace 'localhost' with each machine's IP address.
# Use 'index': 0 on the first worker and 'index': 1 on the second.
tf_config = {
    'cluster': {
        'worker': ['localhost:12345', 'localhost:23456']
    },
    'task': {'type': 'worker', 'index': 0}
}
os.environ['TF_CONFIG'] = json.dumps(tf_config)
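Once TF_CONFIG is set, the same training script runs on both machines. Here is a minimal sketch assuming tf.distribute.MultiWorkerMirroredStrategy (the strategy the linked tutorial uses); the model and the random data below are placeholder assumptions, not part of the original answer:
import numpy as np
import tensorflow as tf

# MultiWorkerMirroredStrategy reads TF_CONFIG to discover the other worker.
# (On older TF 2.x releases it lives at
# tf.distribute.experimental.MultiWorkerMirroredStrategy.)
strategy = tf.distribute.MultiWorkerMirroredStrategy()

# Variables must be created inside the strategy's scope so they are
# mirrored and kept in sync across both workers.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')

# Placeholder random data; in practice each worker would read its own
# shard of the real dataset.
x = np.random.random((256, 10)).astype('float32')
y = np.random.random((256, 1)).astype('float32')

# Run this same script on both laptops (each with its own 'index' in
# TF_CONFIG); gradient updates are synchronized every step.
model.fit(x, y, epochs=2, batch_size=32)
Note that the laptops must be able to reach each other over the network on the ports listed in the cluster spec, and training only begins once every worker has joined.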
This TensorFlow tutorial on Multi-worker training with Keras covers the configuration and model training in detail.
Hope this answers your question.
Upvotes: 1