Reputation: 413
I have the following basic MPI program written in Fortran 90:
program sendRecv
include 'mpif.h'
!MPI Variables
integer ierr, numProcs, procID
!My variables
integer dat, datRec
!Init MPI
call MPI_INIT ( ierr )
!Get number of processes/ cores requested
call MPI_COMM_SIZE (MPI_COMM_WORLD, numProcs, ierr)
!Get rank of process
call MPI_COMM_RANK (MPI_COMM_WORLD, procID, ierr)
if (procID .eq. 0) then
dat=4
!Send num to process 1
call MPI_SEND (dat, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, ierr)
else if (procID .eq. 1) then
!Receive num from process 0
call MPI_RECV (datRec, 1, MPI_INT, 0, MPI_ANY_SOURCE, MPI_COMM_WORLD, MPI_STATUS_SIZE, ierr)
!Display info
write(*,*) "Process 1 received ", datRec, " from proc 0"
else
write(*,*)"Into else"
end if
!Finalise MPI
call MPI_FINALIZE ( ierr )
end program sendRecv
The purpose is just to send an integer from process 0 and receive and display it in process 1, but whatever I try, I cannot get it to work.
I am compiling and running this program with:
mpif90 sendRecv.f90 -o tst
mpirun -n 2 tst
and am getting this:
[conor-Latitude-XT2:3053] *** An error occurred in MPI_Send
[conor-Latitude-XT2:3053] *** on communicator MPI_COMM_WORLD
[conor-Latitude-XT2:3053] *** MPI_ERR_TYPE: invalid datatype
[conor-Latitude-XT2:3053] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 3054 on
node conor-Latitude-XT2 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[conor-Latitude-XT2:03052] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[conor-Latitude-XT2:03052] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I have looked around but I just can't spot my error. Any help would be great, thanks!
Upvotes: 1
Views: 2578
Reputation: 74455
MPI_INT corresponds to the int type of C and C++. The Fortran INTEGER type is represented by the predefined MPI datatype MPI_INTEGER.
Besides, there is another error in your code: you pass MPI_STATUS_SIZE as the status argument of MPI_RECV, whereas you have to pass an integer array of that size, e.g.:
INTEGER status(MPI_STATUS_SIZE)
CALL MPI_RECV(..., status, ierr)
I would also recommend that you replace
include 'mpif.h'
with
use mpi
mpif.h is an obsolete Fortran 77 interface and should not be used in modern programs. The mpi module interface itself is also obsoleted by the mpi_f08 module interface, but that comes from MPI-3.0 and is still not widely implemented.
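Putting the fixes together, a corrected version of your program could look like the sketch below. Note one extra change beyond the two errors above: your MPI_RECV also passed MPI_ANY_SOURCE in the tag position; here the tag is 0, matching the tag used in MPI_SEND.

```fortran
program sendRecv
  use mpi
  implicit none
  !MPI variables
  integer :: ierr, numProcs, procID
  integer :: status(MPI_STATUS_SIZE)
  !My variables
  integer :: dat, datRec

  call MPI_INIT(ierr)
  call MPI_COMM_SIZE(MPI_COMM_WORLD, numProcs, ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, procID, ierr)

  if (procID .eq. 0) then
     dat = 4
     !MPI_INTEGER (not MPI_INT) matches the Fortran INTEGER type
     call MPI_SEND(dat, 1, MPI_INTEGER, 1, 0, MPI_COMM_WORLD, ierr)
  else if (procID .eq. 1) then
     !Source 0, tag 0 (matching the send), and a proper status array
     call MPI_RECV(datRec, 1, MPI_INTEGER, 0, 0, MPI_COMM_WORLD, status, ierr)
     write(*,*) "Process 1 received ", datRec, " from proc 0"
  end if

  call MPI_FINALIZE(ierr)
end program sendRecv
```

Compiled and run exactly as before (mpif90 sendRecv.f90 -o tst; mpirun -n 2 tst), rank 1 should print the value 4 it received from rank 0.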
Upvotes: 2