Reputation: 497
I have a statement in C code which I suspect may be giving me periodic errors, so I want to make sure I am doing the right thing, as it mixes types. The objective is to change the timebase from 1/32768 seconds to 1/1024 seconds, with all times stored as 32-bit integers.
What I have is this:
ts_sys = latest_timestamp * VELO_TICKS_FROM_RTC;
where ts_sys and latest_timestamp are both unsigned 32-bit integers. VELO_TICKS_FROM_RTC is defined as follows:
#define VELO_TICKS_PER_SECOND 1024
#define VELO_TICKS_FROM_RTC (VELO_TICKS_PER_SECOND / 32768.0f)
Should I be using a cast here to make sure the division doesn't return an integer (which would be zero) and therefore return the wrong thing? For example, would this be better:
ts_sys = (uint32_t) ((float)latest_timestamp * VELO_TICKS_FROM_RTC);
but that seems like overkill...
Upvotes: 0
Views: 47
Reputation: 15042
"Should I be using a cast here to make sure the division doesn't return an integer?"
No. 1024 is of type int and latest_timestamp is of type uint32_t. Both get converted to float in the arithmetic expressions before the respective calculation is done:
Otherwise, if the corresponding real type of either operand is float, the other operand is converted, without change of type domain, to a type whose corresponding real type is float.
C18, §6.3.1.8/1; "Usual arithmetic conversions"
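A minimal sketch that demonstrates this, using the macros from the question (the main driver and the sample timestamp value of 65536, i.e. two seconds of RTC ticks, are added here purely for illustration):

#include <stdio.h>
#include <stdint.h>

#define VELO_TICKS_PER_SECOND 1024
#define VELO_TICKS_FROM_RTC (VELO_TICKS_PER_SECOND / 32768.0f)

int main(void)
{
    uint32_t latest_timestamp = 65536;  /* sample value: two seconds in RTC ticks */

    /* 1024 / 32768.0f is evaluated in float, so the macro yields 0.03125f,
       not 0; latest_timestamp is then also converted to float for the
       multiplication, and the float result is converted back on assignment. */
    uint32_t ts_sys = latest_timestamp * VELO_TICKS_FROM_RTC;

    printf("%f -> %u\n", (double)VELO_TICKS_FROM_RTC, (unsigned)ts_sys);
    /* prints: 0.031250 -> 2048 */
    return 0;
}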
"but that seems like an overkill..."
It is.
Upvotes: 0
Reputation: 48278
Should I be using a cast here to make sure the division doesn't return an integer (which would be zero) and therefore return the wrong thing?
No. You are doing A / B where B (32768.0f) is a float, so the compiler promotes A to float and the result of the division is a float! The subsequent multiplication by latest_timestamp is likewise done in float.
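A quick sketch of that promotion rule in isolation (the variable names are illustrative, not from the question):

#include <stdio.h>

int main(void)
{
    int   a = 1024;
    float b = 32768.0f;

    /* a is converted to float because b is a float, so the result is
       0.03125f rather than the integer-division result 0. */
    printf("%f\n", a / b);  /* prints: 0.031250 */
    return 0;
}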
Upvotes: 3