jgangani

Reputation: 25

Segmentation Fault: Dynamically allocating large integer array

I have the following code with one dynamically allocated array, "data". I pass the array size as a command line argument. The program works fine up to datasize = 33790, but gives a segmentation fault for any value > 33790.

"33790" is probably machine specific. I am trying to understand why dynamically allocated memory would cause a seg fault beyond a particular size. Any help is welcome. :)

#include <iostream>
#include <cstdlib>
#include <iomanip>
#include <ctime>

#define N 100000

using namespace std;

int main(int argc, char* argv[])
{
    int a;
    cout<<"Size of int : "<<sizeof(int)<<endl;

    long int datasize = strtol(argv[1],NULL,0);
    cout<<"arg1 : "<<datasize<<endl;
    double sum = 0;
    int *data;
    data = new int(datasize);

    clock_t begin = clock();
    for(int i = 0; i < N; i++)                              //repeat the inner loop N times
    {
        //fetch the data into the cache
        //access it multiple times in order to amortize the compulsory miss latency
        for (long int j = 0; j < datasize; j++)
        {
            sum += data[j];                                 //get entire array of data inside cache
        }
    }

    clock_t end = clock();

    double time_spent = (double) (end - begin);

    cout<<"sum = "<<sum<<endl;
    cout<<"Time Spent for data size = "<<argv[1]<<" is "<<time_spent<<endl;

    delete[] data;

    return 0;
}

Upvotes: 1

Views: 235

Answers (1)

MikeCAT

Reputation: 75062

You are not allocating an array (multiple elements) at all: new int(datasize) allocates a single int and initializes it with the value datasize.

Use new int[datasize] instead of new int(datasize) to allocate an array of datasize ints. Note also that delete[] is only correct for memory allocated with new[]; pairing delete[] with a scalar new (as your current code does) is undefined behavior.

Upvotes: 2

Related Questions