santhosh kgn

Reputation: 41

What is the need for (char*) typecasting in the own sizeof API?

#include<stdio.h>
#define my_sizeof(type) (&type+1) - (&type) 

int main()
{   
 int y;
 printf("size_of int: %ld\n", sizeof(y));
 printf("address of y = %x \n",&y);
 printf("address of y +1 = %x \n", &y+1);

 printf("The sizeof = %d\n", my_sizeof(y));
 getchar();
 return 0;
}

Output:

  size_of int:  4
  address of y =  26f890
  address of y +1 =  26f894
  The sizeof =  1

I am expecting the my_sizeof output to be "4" (i.e., 26f894 - 26f890), but it prints "1".

But if I cast the pointers to char* (i.e., (char*)(&type+1) - (char*)(&type)), the output is "4".

Can anyone tell me why the (char*) cast is needed?

Upvotes: 0

Views: 86

Answers (2)

Vlad from Moscow

Reputation: 310950

In this expression

(&type+1) - (&type)

pointer arithmetic is used. The difference between two pointers that point to elements of the same array (or one past its last element) is the number of elements between them, not the number of bytes. Since &type and &type+1 differ by exactly one int element, the result is 1.

When you cast both pointers to char *, the difference is measured in char elements, that is, in bytes. An int occupies 4 bytes on your system, so the result is 4.

From the C Standard (6.5.6 Additive operators)

9 When two pointers are subtracted, both shall point to elements of the same array object, or one past the last element of the array object; the result is the difference of the subscripts of the two array elements....

For pointer arithmetic, a single object is treated as an array of one element, which is why subtracting &type from &type+1 is valid here.

Upvotes: 1

0___________

Reputation: 67476

I have corrected several instances of undefined behavior (UB) in the printf calls (wrong conversion specifiers for size_t values and for pointers):

#include<stdio.h>
#define my_sizeof(type) ((char *)(&type+1) - (char *)(&type))

int main()
{   
 int y;
 printf("size_of int: %zu\n", sizeof(y));
 printf("address of y = %p \n",(void *)&y);
 printf("address of y +1 = %p \n", (void *)(&y+1));

 printf("The sizeof = %td\n", my_sizeof(y));
 getchar();
 return 0;
}

Upvotes: 0

Related Questions