Tim

Reputation: 1769

.NET default formatting of decimal

Why do these two pieces of code behave differently?

decimal test = 53M;
var label = "Some thing " + test + " other thing";
Console.WriteLine(label);

test = 53.00M;
label = "Some thing " + test + " other thing";
Console.WriteLine(label);

Displays:

Some thing 53 other thing

Some thing 53,00 other thing
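The difference is easy to reproduce in isolation. A minimal sketch (the comma in the output above comes from the current culture's decimal separator; `CultureInfo.InvariantCulture` is used here only to make the output stable):

```csharp
using System;
using System.Globalization;

class Program
{
    static void Main()
    {
        decimal a = 53M;
        decimal b = 53.00M;

        // The two values compare equal...
        Console.WriteLine(a == b); // True

        // ...but their default string representations differ, because a
        // decimal remembers its scale (the number of decimal places).
        Console.WriteLine(a.ToString(CultureInfo.InvariantCulture)); // 53
        Console.WriteLine(b.ToString(CultureInfo.InvariantCulture)); // 53.00
    }
}
```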

Upvotes: 2

Views: 419

Answers (1)

Dmitrii Bychenko

Reputation: 186728

If we consult the binary representation of `Decimal`:

https://learn.microsoft.com/en-us/dotnet/api/system.decimal.getbits?view=netframework-4.8

The binary representation of a Decimal number consists of a 1-bit sign, a 96-bit integer number, and **a scaling factor used to divide the integer number and specify what portion of it is a decimal fraction**. The scaling factor is implicitly the number 10, raised to an exponent ranging from 0 to 28.

(bold is mine, Dmitry Bychenko)

We can easily explain the difference between 53M and 53.00M:

 53M     == {Integer Number:   53; Scaling Factor: 0} ==   53 / 10**0 
 53.00M  == {Integer Number: 5300; Scaling Factor: 2} == 5300 / 10**2
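The layout quoted above can be inspected directly with `decimal.GetBits`; a minimal sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        // decimal.GetBits returns four ints: the 96-bit integer number
        // (lo, mid, hi) plus a flags word holding the sign bit and the
        // scaling factor (the scale lives in bits 16-23 of the flags).
        int[] bits53   = decimal.GetBits(53M);
        int[] bits5300 = decimal.GetBits(53.00M);

        static int Scale(int[] bits) => (bits[3] >> 16) & 0xFF;

        Console.WriteLine($"53M    -> integer {bits53[0]}, scale {Scale(bits53)}");
        // 53M    -> integer 53, scale 0
        Console.WriteLine($"53.00M -> integer {bits5300[0]}, scale {Scale(bits5300)}");
        // 53.00M -> integer 5300, scale 2
    }
}
```

So `53M` is stored as 53 with scale 0, while `53.00M` is stored as 5300 with scale 2, and `ToString` keeps those two trailing zeros.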

Upvotes: 6
