Reputation: 303
I want to broadcast values from the chief to all workers with distributed TensorFlow, like MPI's bcast: https://mpi4py.readthedocs.io/en/stable/tutorial.html#collective-communication
I guess broadcast_send or tf.raw_ops.CollectiveBcastSend is the right operation, but I could not find any examples in the official TensorFlow documentation.
Is there a good example of using such low-level distributed operations?
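For reference, here is a minimal single-process sketch of what I have pieced together from the raw op signatures, not from any official example. It simulates two participants with two logical CPU devices and threads; a real deployment would span multiple workers with a cluster configuration instead, and the group/instance key values are arbitrary choices of mine:

```python
import threading
import tensorflow as tf

# Split the physical CPU into two logical devices so both collective
# participants can run inside one process (for illustration only).
cpus = tf.config.list_physical_devices('CPU')
tf.config.set_logical_device_configuration(
    cpus[0],
    [tf.config.LogicalDeviceConfiguration(),
     tf.config.LogicalDeviceConfiguration()])

GROUP_SIZE = 2      # total number of participants in the collective
GROUP_KEY = 100     # identifies the group of devices (arbitrary value)
INSTANCE_KEY = 200  # identifies this particular broadcast (arbitrary value)

results = [None, None]

def chief():
    # The "chief" holds the value and sends the broadcast.
    with tf.device('/device:CPU:0'):
        value = tf.constant([1.0, 2.0, 3.0])
        results[0] = tf.raw_ops.CollectiveBcastSend(
            input=value, group_size=GROUP_SIZE, group_key=GROUP_KEY,
            instance_key=INSTANCE_KEY, shape=value.shape)

def worker():
    # The worker joins the same collective instance and receives the value.
    with tf.device('/device:CPU:1'):
        results[1] = tf.raw_ops.CollectiveBcastRecv(
            T=tf.float32, group_size=GROUP_SIZE, group_key=GROUP_KEY,
            instance_key=INSTANCE_KEY, shape=[3])

# Collective ops block until all participants arrive, so both sides
# must execute concurrently.
threads = [threading.Thread(target=chief), threading.Thread(target=worker)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results[1].numpy())
```

I am not sure whether this is the intended way to use these ops, or whether something higher-level (e.g. tf.distribute) is preferred for broadcasting from the chief.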
Upvotes: 1
Views: 119