Reputation: 117
I have tried to multiply two numbers, i.e. 100000 and 100000 + 1, in a C program, but I am not getting the correct output.
printf("%lld",(100000)*(100001));
I have tried the above code on different compilers, but I always get the same result, 1410165408, instead of 10000100000.
Upvotes: 3
Views: 1522
Reputation: 186668
Well, let's multiply
int64_t a = 100000;
int64_t b = 100001;
int64_t c = a * b;
And we'll get (binary)
1001010100000011010110101010100000 /* 10000100000 decimal */
but if you convert it to int32_t
int32_t d = (int32_t) c;
you'll get the last 32 bits only (the leading two bits, 10, are thrown away):
01010100000011010110101010100000 /* 1410165408 decimal */
The simplest way out, probably, is to declare both constants as 64-bit values (the LL suffix stands for long long):
printf("%lld",(100000LL)*(100001LL));
Upvotes: 6
Reputation: 213678
In C, the type used for a calculation is determined by the types of the operands, not by the type of the variable where you store the result.
A plain integer constant such as 100000 is of type int, because it fits inside one. The result of 100000 * 100001, however, does not fit in an int, so you get integer overflow and undefined behavior. Switching to long won't necessarily solve anything, because long might be 32 bits too.
In addition, printing an int with the %lld format specifier is also undefined behavior on most systems.
The root of all evil here is the crappy default types in C (called "primitive data types" for a reason). Simply get rid of them and all their uncertainties, and all your bugs will go away with them:
#include <stdio.h>
#include <inttypes.h>

int main(void)
{
    printf("%"PRIu64, (uint64_t)100000 * (uint64_t)100001);
    return 0;
}
Or equivalent: UINT64_C(100000) * UINT64_C(100001).
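As a side note on the operand rule above: it is enough to widen a single operand, because the usual arithmetic conversions then bring the other operand up to the wider type before the multiplication happens. A minimal sketch, again assuming C99's <stdint.h> and <inttypes.h>:
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    int n = 100001;
    /* The cast widens one operand to int64_t, so the whole
       multiplication is carried out in 64 bits. */
    int64_t product = (int64_t)100000 * n;
    printf("%" PRId64 "\n", product);  /* prints 10000100000 */
    return 0;
}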
Upvotes: 5
Reputation: 399793
Your two integers are int, so the result is int too. That the printf() format specifier says %lld, which needs a long long int, doesn't matter.
You can cast or use suffixes:
printf("%lld", 100000LL * 100001LL);
This prints 10000100000. Of course there's still a limit, since the number of bits in a long long int is still constant.
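For completeness, a cast on one operand works just as well as the LL suffixes, and <limits.h> shows where that long long limit sits. A small sketch, assuming a C99 hosted environment:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Casting one operand is an alternative to the LL suffixes. */
    printf("%lld\n", (long long)100000 * 100001);  /* 10000100000 */

    /* long long is at least 64 bits wide, but still finite. */
    printf("LLONG_MAX = %lld\n", LLONG_MAX);
    return 0;
}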
Upvotes: 4
Reputation: 2949
You can do it like this:
long long int a = 100000;
long long int b = 100001;
printf("%lld",(a)*(b));
This will give the correct answer.
What you are doing is (100000)*(100001), i.e. by default the compiler treats 100000 and 100001 as int, multiplies them as int, and stores the result in an int. But during printf that int is printed as a long long int.
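Wrapped into a complete program (just a sketch, assuming a hosted environment with <stdio.h>), the fix looks like this:
#include <stdio.h>

int main(void)
{
    long long int a = 100000;
    long long int b = 100001;

    /* Both operands are long long int, so the multiplication is done
       in long long and matches the %lld format specifier. */
    printf("%lld\n", a * b);  /* prints 10000100000 */
    return 0;
}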
Upvotes: 1