Reputation: 9
Basically, I have this function which counts the number of lines in a text file and it is giving me a headache. It works like a charm with small files (say with 50000 lines). However, for some reason, I get a segmentation fault error when I try to count the lines in a file that has 1 million lines. This is the code:
int countlines(char *filename)
{
    // count the number of lines in the file called filename
    FILE *fp = fopen(filename, "r");
    int ch = 0;
    int lines = 0;

    if (fp == NULL)
        return 0;

    while ((ch = fgetc(fp)) != EOF)
    {
        if (ch == '\n')
            lines++;
    }

    fclose(fp);
    return lines;
}
I have honestly tried a thousand variations of this and I can't figure out what's wrong. The count reaches 1000000, but then the program crashes with a segmentation fault. Any help will be greatly appreciated!
Edit: Since everyone is saying it works for them, I'll show you what I have in my main function.
int main(int argc, const char * argv[])
{
    int X_len = countlines("/homes/myworkspace/X.txt");
    int X[X_len][4];
    printf("\n X_len = %d", X_len);
}
Upvotes: 1
Views: 168
Reputation: 108968
This is the problem:
int X[X_len][4];
With X_len at 1000000, that variable-length array needs roughly 16 MB (assuming a 4-byte int), far more than a typical stack provides, so declaring it crashes the program.
Try dynamic allocation (heap) instead:
int (*X)[4];                      /* pointer to a row of 4 ints */
X = malloc(X_len * sizeof *X);    /* one block holding X_len rows */
if (X == NULL)
    /* handle the allocation failure */;
// ... use X[i][j] just like the array ...
free(X);
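For completeness, a minimal sketch of how main could look with that heap allocation (the file path is the one from the question, and the error handling is just a placeholder):

#include <stdio.h>
#include <stdlib.h>

int countlines(char *filename);   /* as defined in the question */

int main(void)
{
    int X_len = countlines("/homes/myworkspace/X.txt");

    /* X_len rows of 4 ints, allocated on the heap instead of the stack */
    int (*X)[4] = malloc(X_len * sizeof *X);
    if (X == NULL)
    {
        fprintf(stderr, "allocation of %d rows failed\n", X_len);
        return 1;
    }

    printf("\n X_len = %d", X_len);

    /* elements are still indexed X[i][j], exactly like the original array */

    free(X);
    return 0;
}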
Upvotes: 2
Reputation: 5528
I'd say the problem is here:
int X[X_len][4];
This declares the array on the stack, which is typically limited to a few megabytes; with 1 000 000 rows of four ints the array needs about 16 MB, and that would explain the crash right after the count reaches 1 000 000 lines.
I suggest allocating it on the heap instead:
int *X = (int*)malloc(X_len * sizeof(int) * 4);
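Note that with a flat int * like this, the element in row i, column j is addressed as X[i * 4 + j] rather than X[i][j]. A minimal self-contained sketch (the row count, row, and column values are just placeholders):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int X_len = 1000000;                       /* placeholder row count */
    int *X = malloc(X_len * sizeof(int) * 4);  /* flat X_len x 4 block */
    if (X == NULL)
        return 1;

    X[5 * 4 + 2] = 7;                          /* row 5, column 2 */
    printf("%d\n", X[5 * 4 + 2]);

    free(X);
    return 0;
}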
Upvotes: 0