Reputation: 2528
We are seeing some strange behaviour with decimals being displayed to the console. This is best described by the following code:
string num1AsString = "1.0000";
decimal num1AsDecimal = decimal.Parse(num1AsString);
string num2AsString = "1";
decimal num2AsDecimal = decimal.Parse(num2AsString);
Console.WriteLine(num1AsDecimal);
Console.WriteLine(num2AsDecimal);
The output to the Console is:
1.0000
1
However, both num1AsDecimal and num2AsDecimal show up in the debugger as just 1. How is num1AsDecimal preserving the .0000? It seems to be somehow holding onto its string representation. Is there some kind of boxing going on here?
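For what it's worth, a quick check (sketched below, reusing the variables above) suggests no boxing is involved; both variables are plain System.Decimal values and compare equal:
Console.WriteLine(num1AsDecimal.GetType());        // System.Decimal
Console.WriteLine(num2AsDecimal.GetType());        // System.Decimal
Console.WriteLine(num1AsDecimal == num2AsDecimal); // True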
Billy Stack
Upvotes: 0
Views: 140
Reputation: 62459
A footnote on the MSDN page for Decimal.Parse says:
The decimal result of decimal.Parse will have the same number of significant digits as the string it was parsed from. That is, "3.43" and "3.4300", while parsing to the same numeric value, result in different decimal representations. This is (somewhat) accounted for in the description of the binary representation of a decimal.
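To make the footnote concrete, here is a small sketch (my illustration, not from the MSDN page) using decimal.GetBits, which exposes the 128-bit representation; the scale is stored in bits 16-23 of the fourth element:
decimal a = decimal.Parse("3.43");
decimal b = decimal.Parse("3.4300");
Console.WriteLine(a == b); // True: same numeric value
int[] bitsA = decimal.GetBits(a);
int[] bitsB = decimal.GetBits(b);
// The scale (digits after the decimal point) lives in bits 16-23 of element 3.
Console.WriteLine((bitsA[3] >> 16) & 0xFF); // 2: stored as 343 / 10^2
Console.WriteLine((bitsB[3] >> 16) & 0xFF); // 4: stored as 34300 / 10^4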
Upvotes: 2
Reputation: 1502206
We are finding some strange behaviour with the decimals being displayed to the console.
It's not strange at all. It's documented:
The scaling factor also preserves any trailing zeroes in a Decimal number. Trailing zeroes do not affect the value of a Decimal number in arithmetic or comparison operations. However, trailing zeroes can be revealed by the ToString method if an appropriate format string is applied.
In other words, it's the debugger behaviour which is slightly strange here, not ToString(). In the debugger, if you watch num1AsDecimal.ToString(), that should preserve the trailing zeroes too.
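As a sketch of that behaviour (reusing the variables from the question):
Console.WriteLine(num1AsDecimal.ToString());       // 1.0000 - the stored scale round-trips
Console.WriteLine(num1AsDecimal == num2AsDecimal); // True - trailing zeroes don't affect comparison
Console.WriteLine(num1AsDecimal.ToString("0.##")); // 1 - a format string can trim them
Console.WriteLine(num2AsDecimal.ToString("F4"));   // 1.0000 - or add them back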
Upvotes: 2