Reputation: 1
I am converting a string from hex to decimal. The problem is that with the Visual Studio compiler the conversion returns a wrong value. However, when I compile the same code on a Mac in the terminal using the g++ compiler, the value is returned correctly.
Why is this happening?
#include <cstdlib>   // strtoul
#include <iostream>
#include <string>
using namespace std;

int main()
{
    string hex = "412ce69800";
    unsigned long n = strtoul( hex.c_str(), nullptr, 16 );
    cout << "The value to convert is: " << hex << " hex\n\n";
    cout << "The converted value is: " << n << " dec\n\n";
    cout << "The converted value should be: " << "279926183936 dec\n\n";
    return 0;
}
output:
Upvotes: 0
Views: 408
Reputation: 41874
Because on Windows, long is a 32-bit type, unlike most Unix/Linux implementations, which use the LP64 data model in which long is 64 bits. The number 412ce69800 needs 39 bits, so it inherently cannot be stored in a 32-bit type. Read the compiler warnings and you'll spot the issue immediately.
The C standard only requires long to have at least 32 bits. C99 added a new long long type with at least 64 bits, and that is guaranteed on all platforms. So if your value is within a 64-bit type's range, use unsigned long long (or uint64_t/uint_least64_t) and strtoull instead to get the correct value.
Upvotes: 2