Reputation: 465
I am puzzled by the output of this code:
#include <stdio.h>
#include <stdlib.h>
#define TIMING_OUTPUT_FILENAME_MAX_LENGTH 40
int main (int argc, char **argv)
{
char *timing_output_filename = malloc(TIMING_OUTPUT_FILENAME_MAX_LENGTH);
printf("requested buffer size is %ld bytes and pointer size is %ld bytes\n",
sizeof(timing_output_filename),
sizeof(*timing_output_filename));
return 0;
}
and here is the output:
requested buffer size is 8 bytes and pointer size is 1 bytes
Isn't it supposed to print 40 bytes? Am I missing something?
Upvotes: 0
Views: 263
Reputation: 134286
Isn't it supposed to print 40 bytes?
No, it's not.
In your code, timing_output_filename is a pointer, and applying the sizeof operator to a pointer yields the size of the pointer itself, not the size of the memory it points to. sizeof(timing_output_filename) is therefore the same as sizeof(char *): it produces the size of a pointer to char on your platform.
Note: sizeof produces a result of type size_t, so you should use the %zu format specifier (not %ld) to print the result.
Upvotes: 5