Reputation: 480
Consider the following piece of code:
void function (char arg)
{
// function's content
}
int main(void)
{
long double x = 1.23456;
function(x);
}
I'm giving the function an argument it is not supposed to get. Why does it not cause an error?
Upvotes: 3
Views: 106
Reputation: 263487
It's converted implicitly.
In the context of an assignment, argument passing, a return statement, and several others, an expression of any arithmetic type is implicitly converted to the type of the target if the target type is also arithmetic. In this case, the long double argument is implicitly converted to char. (That particular conversion rarely makes sense, but it's valid as far as the language is concerned.)
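A minimal sketch of what that conversion does, extending the question's code (the printf inside function is added here just for illustration): the fractional part is discarded by the floating-to-integer conversion, so arg receives the value 1.

#include <stdio.h>

void function(char arg)
{
    // By the time we get here, the long double has already been
    // converted to char at the call site.
    printf("arg = %d\n", arg);
}

int main(void)
{
    long double x = 1.23456L;
    function(x);   // implicit conversion: long double -> char; prints "arg = 1"
    return 0;
}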
Note that this implicit conversion is not done for variadic arguments (for example, arguments to printf after the format string), because the compiler doesn't know what the target type is. printf("%d\n", 1.5) doesn't convert 1.5 from double to int; rather, it has undefined behavior.
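A short sketch of the difference: the only conversions applied to variadic arguments are the default argument promotions (float to double; char and short to int), so matching each argument to its format specifier is the caller's job.

#include <stdio.h>

int main(void)
{
    /* printf("%d\n", 1.5); */   // undefined behavior: %d expects int, 1.5 is a double

    printf("%d\n", (int)1.5);    // OK: explicit conversion; prints 1
    printf("%f\n", 1.5);         // OK: %f matches double; prints 1.500000

    float f = 1.5f;
    printf("%f\n", f);           // OK: f is promoted to double by the default argument promotions
    return 0;
}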
There are also rules for evaluating expressions whose operands have different types (the "usual arithmetic conversions"). I won't go into all the details here, but for example given:
int n = 42;
double x = 123.4;
if you write n + x, the value of n is promoted (implicitly converted) from int to double before the addition is performed.
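A compilable version of that fragment, for reference; both operands are double by the time the addition happens, so the result is 165.4:

#include <stdio.h>

int main(void)
{
    int n = 42;
    double x = 123.4;
    double sum = n + x;   // n is converted to double; the addition is done in double
    printf("%f\n", sum);  // prints 165.400000
    return 0;
}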
Upvotes: 4
Reputation: 251
In your example, the long double value is implicitly converted to a char.
Upvotes: 2