Amr Ayman

Reputation: 1159

Why is valgrind detecting so many memory allocations in a simple while loop

I have this code:

FILE *stream;
char *buf_test = calloc(1024, sizeof(char));

size_t index = 0;
stream = fopen("test.txt", "r");
while (fgets(&buf_test[index], 1024, stream) != NULL)
    index = strlen(buf_test);
(buf_test[index-1] == '\n') ? buf_test[index-1] = 0 : 0;
printf("test.txt: %s\n", buf_test);
fclose(stream);
free(buf_test);

When I run valgrind to check whether it manages memory correctly, I get:

total heap usage: 2 allocs, 2 frees, 1,592 bytes allocated
All heap blocks were freed -- no leaks are possible

but when I change this:

stream = fopen("test.txt", "r");
while (fgets(&buf_test[index], 1024, stream) != NULL)
    index = strlen(buf_test);

to this:

while (fgets(&buf_test[index], 1024, (stream = fopen("test.txt", "r"))) != NULL)
    index = strlen(buf_test);

I get a segmentation fault, and valgrind reports 580,952 bytes allocated and still reachable.

What is happening here?

Upvotes: 0

Views: 201

Answers (1)

nos

Reputation: 229224

What is happening is that:

  • Each iteration of the loop opens the file again
  • fopen() creates and returns a FILE *, which typically involves dynamically allocating memory for the FILE object and its internal buffers
  • A process can normally have only a limited number of files open at once
  • Eventually you hit that limit, and fopen() fails (it returns a NULL pointer)
  • You then pass that NULL pointer to fgets(), which is undefined behavior and causes your program to crash.

Upvotes: 3

Related Questions