Reputation: 41
I've seen many #defines for literals that use a type cast, for example:
#define THE_ANSWER ((uint8_t) 42)
So far, I can hardly imagine a situation where this really matters.
Can someone give me an example where a #define directive without a type cast leads to "unexpected" behaviour? Ideally one example for a desktop environment and one for an embedded or microcontroller environment.
Thank you for your suggestions.
Upvotes: 1
Views: 1007
Reputation: 485
One example is multiplication. Say you have the following:
#define KILO 1024
What happens if you later do this?
unsigned long long val = KILO * KILO * KILO * KILO * KILO * ...
You might think the multiplications would be evaluated with the type unsigned long long, but in reality they all happen in int. So if your directive doesn't have a type cast, or the constant isn't written with a wider suffix such as 1024ULL, you may end up with unexpected behavior because of an int overflow, even though your variable's type can actually hold the result.
Upvotes: 1