Reputation: 4108
I have a simple program that gives me strange results in the Visual Studio debugger.
Console.WriteLine("4280.159d * 1000000 = {0}", 4280.159d * 1000000);
double D = 4280.159d * 1000000;
Console.WriteLine("D = {0}", D);
Console.WriteLine("Put a breakpoint here and look at value of D in debugger");
This prints out the following:
4280.159d * 1000000 = 4280159000
D = 4280159000
This output is correct, but if you look at D in the debugger its value shows as 4280158999.9999995.
If you type the same expression in the Immediate window you get the same incorrect answer.
The only way to get what seems to be the correct answer is to cast to decimal in the debugger
(decimal)4280.159d * 1000000
4280159000
The behaviour is the same in VS2010 and VS2012, and it doesn't seem to matter whether the build is 64-bit or 32-bit. Why does the debugger give a different result from the running code, and is there a way to fix it?
Upvotes: 0
Views: 891
Reputation: 48134
The debugger shows the correct value: 4280158999.9999995 is what is actually stored in D, because 4280.159 cannot be represented exactly as a binary double. The WriteLine
call rounds the double when it converts it to a string for display, which is why the printed output looks like the exact answer.
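You can see the same effect outside the debugger. As a sketch (in Python rather than C#, since Python's float is the same IEEE-754 64-bit double, and its default repr prints the shortest decimal string that round-trips; the variable name is mine, not from the question):

```python
# The same arithmetic as the question, in IEEE-754 doubles.
product = 4280.159 * 1000000

# repr shows the value actually stored in the double -- the one the
# VS debugger reports:
print(repr(product))            # 4280158999.9999995

# Formatting to 15 significant digits rounds the value, which is
# roughly what Console.WriteLine("{0}", ...) does for a double:
print(format(product, ".15g"))  # 4280159000
```

In other words, both displays describe the same bits; they just use a different number of significant digits when converting the double to text.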
Upvotes: 6