Reputation: 1768
I am working on a simple python script to test mpi4py. Specifically, I want to broadcast a scalar and an array from a given processor (say rank 0
), so that all other processors have access to the values of the broadcasted scalar and the array in subsequent steps.
This is what I have done till now:
from __future__ import division
from mpi4py import MPI
import numpy as np
comm = MPI.COMM_WORLD
nproc = comm.Get_size()
rank = comm.Get_rank()
if rank==0:
    scal = 55.0
    mat = np.array([[1,2,3],[4,5,6],[7,8,9]])
    arr = np.ones(5)
    result = 2*arr
    comm.bcast([ result , MPI.DOUBLE], root=0)
    comm.bcast( scal, root=0)
    comm.bcast([ mat , MPI.DOUBLE], root=0)
for proc in range(1, 3):
    if (rank == proc):
        print "Rank: ", rank, ". Array is: ", result
        print "Rank: ", rank, ". Scalar is: ", scal
        print "Rank: ", rank, ". Matrix is: ", mat
But I get the following error:
print "Rank: ", rank, ". Matrix is: ", mat
NameError: name 'mat' is not defined
Also, in my output (from print "Rank: ", rank, ". Scalar is: ", scal and print "Rank: ", rank, ". Array is: ", arr), I do not see the values of scal and arr. What am I missing here? I would really appreciate any help.
Upvotes: 2
Views: 5093
Reputation: 758
I see two errors here:

1. scal and your NumPy arrays mat, arr and result are only defined on rank 0. They should be defined on all the MPI ranks: since the data are broadcast to all ranks, the variables and the NumPy arrays must be allocated on every rank to store the received results.
2. bcast is intended for generic Python objects, which are pickled (i.e. serialized) before being sent; Bcast is intended for NumPy arrays. So use the two calls according to what you are sending/receiving. Moreover, they have to be called on all the ranks.

As I am using Python 3, I have also corrected the print calls. Still, you should not notice any issue with Python 2, thanks to the print_function import from __future__.
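To see why the two calls differ: the lowercase bcast serializes arbitrary Python objects with pickle, while the uppercase Bcast copies the raw memory buffer of a preallocated NumPy array. Here is a minimal single-process sketch of the two transfer models (no MPI involved, just illustrating the serialization difference):

```python
import pickle
import numpy as np

# bcast-style: any Python object is pickled (serialized) to bytes,
# sent over the wire, and unpickled on the receiving side.
scal = 55.0
payload = pickle.dumps(scal)           # serialization cost on every send
received_scal = pickle.loads(payload)  # the receiver gets a new object back

# Bcast-style: a NumPy array exposes its raw buffer, which is copied
# directly into an array the receiver has already allocated with the
# same shape and dtype -- no pickling involved.
mat = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]], dtype='d')
received_mat = np.empty((3, 3), dtype='d')  # receiver must preallocate
received_mat[:] = np.frombuffer(mat.tobytes(), dtype='d').reshape(3, 3)
```

This is also why the lowercase variant returns the received object (scal = comm.bcast(scal, root=0)), while the uppercase variant fills in place the array you pass to it.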
Finally, I advise you to have a look at the MPI4Py tutorial here: http://mpi4py.scipy.org/docs/usrman/tutorial.html. I think it covers a large spectrum of what you may do with MPI4Py.
Here is something working:
from __future__ import division, print_function
from mpi4py import MPI
import numpy as np
comm = MPI.COMM_WORLD
nproc = comm.Get_size()
rank = comm.Get_rank()
scal = None
mat = np.empty([3,3], dtype='d')
arr = np.empty(5, dtype='d')
result = np.empty(5, dtype='d')
if rank==0:
    scal = 55.0
    mat[:] = np.array([[1,2,3],[4,5,6],[7,8,9]])
    arr = np.ones(5)
    result = 2*arr
comm.Bcast([ result , MPI.DOUBLE], root=0)
scal = comm.bcast(scal, root=0)
comm.Bcast([ mat , MPI.DOUBLE], root=0)
print("Rank: ", rank, ". Array is:\n", result)
print("Rank: ", rank, ". Scalar is:\n", scal)
print("Rank: ", rank, ". Matrix is:\n", mat)
Upvotes: 8