Reputation: 11
I wrote a small test to create memory leaks on purpose and check them in the terminal with the leaks command. Now I have run into strange behaviour involving NULL. Can anyone explain why one version leaks and the other doesn't? Aren't they effectively the same?
#include <stdlib.h>

int main(void)
{
    char *ptr;
    char *btr;

    ptr = NULL;
    btr = (char*)malloc(4);
    btr = ptr;   /* allocated block is no longer reachable */
    while (1)
        ;
    return (0);
}
// LEAKS
#include <stdlib.h>

int main(void)
{
    char *btr;

    btr = (char*)malloc(4);
    btr = NULL;   /* allocated block is no longer reachable */
    while (1)
        ;
    return (0);
}
//NO LEAKS ?? why
Upvotes: 0
Views: 509
Reputation: 1
Unfortunately I can't comment, but some people have suggested in the comments that this would be the result of compiler optimization.
However, if we copy the original non-leaky code and change the allocation size to a larger number, we can clearly see that it does allocate memory:
#include <stdlib.h>

int main(void)
{
    char *btr;

    btr = (char*)malloc(1000000);   /* roughly 1 MB allocation */
    btr = NULL;
    while (1)
        ;
    return (0);
}
Leaks output:
Process 9945: 162 nodes malloced for 994 KB
Process 9945: 0 leaks for 0 total leaked bytes.
If we comment the allocation out:
Process 10070: 161 nodes malloced for 14 KB
Process 10070: 0 leaks for 0 total leaked bytes.
So the allocation clearly does happen, and leaks is aware of it (+1 node, roughly +1 MiB). If the compiler (clang, default settings on the Hive Macs, no special flags) has supposedly optimised it out, how can leaks know about the allocation?
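For comparison, one way to see how leaks separates "nodes malloced" from "leaked bytes" is to keep the block reachable, for instance through a global pointer. This is only a sketch of mine (the global g_keep is not from the original post); with the block still referenced, leaks should count the extra node but still report 0 leaks:

#include <stdlib.h>

/* A live reference to the block: leaks should count it as a
   malloced node, but not report it as leaked. */
static char *g_keep;

int main(void)
{
    g_keep = (char*)malloc(1000000);
    while (1)
        ;
    return (0);
}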
Upvotes: 0
Reputation: 9209
If anything, I would hazard that you have compiler optimisation turned on. In the second example the optimiser will likely discard the line
btr = (char*)malloc(4);
during compilation, because you immediately overwrite the result with NULL.
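If that is what is happening, making the result observable should prevent the optimisation. Here is a rough sketch (the printf is my addition, not part of the original question): printing the pointer forces it to escape, so the compiler can no longer drop the malloc call; whether leaks then flags the block as a leak can be compared against the earlier output.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *btr;

    btr = (char*)malloc(4);
    printf("%p\n", (void *)btr);  /* pointer escapes: malloc cannot be optimised away */
    btr = NULL;                   /* block is no longer referenced by btr */
    while (1)
        ;
    return (0);
}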
Upvotes: 1