Doug

Reputation: 49

C code has a small array size limit

I am testing and tuning a protocol. When I run the following code, it crashes when I make my array size larger, say 300000; it runs fine at 200000. That number seems awfully small. Is it my PC or my code? Please help. My goal is to see how fast I can do simple math, write to a file, close the file, reopen and read from the file, and write to another file. I am using Code::Blocks with the GNU GCC compiler on Win7. I also get different numbers from what I expect at the bottom of dig2.txt (not increasing by 2).

#include <stdio.h>

int main() {
    printf("Hello Test\n");
    FILE *fp;
    FILE *fp2;
    FILE *fp3;
    int a;
    int sizofFile = 300000;
    int pix[sizofFile];
    int pix2[sizofFile];

    fp = fopen("dig.txt", "w+");
    for (a = 0; a < sizofFile; a = a + 1) {
        pix[a] = a * 2; // load the array
    }
    fwrite(pix, sizeof(a), sizofFile, fp); // stream data to file
    printf("Good bye Test \n");
    fclose(fp); // close first file

    fp2 = fopen("dig.txt", "r"); // reopen file to read
    fp3 = fopen("dig2.txt", "w+"); // open a write file for human vision
    fread(pix2, sizeof(a), sizofFile, fp2);
    for (a = 0; a < sizofFile; a = a + 1) {
        fprintf(fp3, "%d \n", pix2[a]); // make human readable
    }
    fclose(fp2);
    fclose(fp3);
    return 0;
}

Upvotes: 0

Views: 50

Answers (1)

Jean-François Fabre

Reputation: 140216

Local (auto) arrays are stored on the stack, so their maximum size is limited by the stack size. You could increase that limit with compiler/linker options, but that's not the best way.
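(For what it's worth, with MinGW GCC on Windows you can raise the stack reserve with a linker flag along these lines; the 16 MB size here is just an illustration, check your toolchain's documentation:)

gcc -Wl,--stack,16777216 main.c -o main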

You should allocate the data with malloc instead; the limit is much, much higher (it only depends on the process memory limit / physical memory of the machine). That's how big arrays should be allocated:

int *pix = malloc(sizofFile * sizeof(int));   // allocated on the heap, not the stack
int *pix2 = malloc(sizofFile * sizeof(int));

Use them just like you're doing now, then free the memory before leaving:

free(pix); free(pix2);
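Putting it together, a minimal sketch of the skeleton (the out-of-memory check is my addition, but you should keep one like it):

#include <stdio.h>
#include <stdlib.h>  // for malloc/free

int main() {
    int sizofFile = 300000;
    int *pix = malloc(sizofFile * sizeof(int));   // heap, not stack
    int *pix2 = malloc(sizofFile * sizeof(int));
    if (pix == NULL || pix2 == NULL) {            // always check malloc's result
        fprintf(stderr, "out of memory\n");
        return 1;
    }
    // ... same file writing/reading code as in the question ...
    free(pix);   // release the memory before leaving
    free(pix2);
    return 0;
}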

Upvotes: 1
