smtnkc

Reputation: 508

MPI calloc Causes Segmentation Fault

I wrote a program that finds the sum of the elements of an array with MPI. Both the root and the workers compute the sum of a portion of the array, and the workers send their partial sums to the root at the end. When I use a statically sized array there is no problem, but I get a segmentation fault when I allocate the array with calloc. The source code is given below:

#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>
#define tag1 1 /* send from root to workers */
#define tag2 2 /* send from workers to root */
#define root 0
#define n_data 12

int main(int argc, char *argv[]) 
{ 
    int total_sum, partial_sum;
    int my_id, i, n_procs, n_portion;

    MPI_Init(&argc, &argv);
    MPI_Status status;
    MPI_Comm_rank(MPI_COMM_WORLD, &my_id);
    MPI_Comm_size(MPI_COMM_WORLD, &n_procs);
    n_portion=n_data/n_procs;

    int *array = (int *)calloc(n_data, sizeof(int));
    int *local  = (int *)calloc(n_portion, sizeof(int));

    if(my_id == root) { 

        /* initialize array */
        for(i = 0; i < n_data; i++) 
            array[i]=i;

        /* send a portion of the array to each worker */
        for(i= 1; i < n_procs; i++) 
            MPI_Send( &array[i*n_portion], n_portion, MPI_INT,i, tag1, MPI_COMM_WORLD); 

        /* calculate the sum of my portion */
        for(i = 0; i < n_portion; i++)
            total_sum += array[i];

        /* collect the partial sums from workers */
        for(i= 1; i < n_procs; i++) {
            MPI_Recv( &partial_sum, 1, MPI_INT, MPI_ANY_SOURCE,tag2, MPI_COMM_WORLD, &status);
            total_sum += partial_sum; 
        }

        printf("The total sum is: %d\n", total_sum);
    }
    else { /* I am a worker, receive data from root */

        MPI_Recv( &local, n_portion, MPI_INT, root, tag1, MPI_COMM_WORLD, &status);

        /* Calculate the sum of my portion of the array */
        partial_sum = 0;
        for(i = 0; i < n_portion; i++)
            partial_sum += local[i];

        /* send my partial sum to the root */
        MPI_Send( &partial_sum, 1, MPI_INT, root, tag2, MPI_COMM_WORLD);
    }

    MPI_Finalize(); 
    return 0;
}

The error I get is:

-bash-4.1$ mpirun -np 3 distApprox
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 110834 on node levrek1 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Thanks for any help.

Upvotes: 3

Views: 289

Answers (1)

Johannes

Reputation: 46

I'd say the problem lies in the MPI_Recv on the worker side. You should pass 'local' instead of '&local' as the buffer. MPI expects the "initial address of the receive buffer" (see the MPI standard), which in the case of a dynamically allocated array is the pointer variable itself:

MPI_Recv( local, n_portion, MPI_INT, root, tag1, MPI_COMM_WORLD, &status);
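To see why '&local' crashes, here is a minimal sketch (plain C, no MPI, names chosen just for illustration): the pointer variable and the buffer it points to live at different addresses, and only the buffer address is a valid receive buffer.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *local = calloc(4, sizeof(int));

    /* 'local' (an int *) holds the address of the calloc'd buffer,
       which is what MPI_Recv expects as its first argument.        */
    printf("buffer address : %p\n", (void *)local);

    /* '&local' (an int **) is the address of the pointer variable
       itself; receiving data through it overwrites the pointer and
       the memory around it, hence the segmentation fault.          */
    printf("pointer address: %p\n", (void *)&local);

    free(local);
    return 0;
}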

You may also want to initialize 'total_sum' to 0 on the root, and then your code should run.
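A minimal sketch of that change, reusing the declaration from the question (zero-initializing both sums so the '+=' accumulation starts from a defined value):

    int total_sum = 0, partial_sum = 0;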

Edit: Just saw that Martin Zabel already pointed this out in the comments.

Upvotes: 3
