Reputation: 98
I'm trying to save some numbers in a decimal variable, e.g. the value could be:
Dim someDecimalVar As Decimal = 210.00483839999998
The output is 210.0048384, but I need the exact value.
Why do the numbers lose accuracy when stored in a Decimal/Double, and how do I prevent this?
Edit: I get the value from an object (I'm trying to make a copy of it) which stores the numbers as Double. How is this possible?
Upvotes: 1
Views: 223
Reputation: 43743
Without some example code demonstrating the problem to provide context, it's difficult to say for sure what your exact problem is or how to fix it. However, perhaps this additional information will help you. You provided the following example:
Dim someDecimalVar As Decimal = 210.00483839999998
The value 210.00483839999998 is called a literal (i.e. a hard-coded value). All literals have a specific type, even if the type isn't explicitly stated. When no type is specified, the compiler infers the type based on the value. So, for instance, it will assume that 1 is an Integer, 1.2 is a Double, and "Hello" is a String. So, in your example code, the number literal is interpreted by the compiler as a Double literal, since you didn't specify. In other words, your code is, more or less, equivalent to the following:
Dim someDoubleVar As Double = 210.00483839999998
Dim someDecimalVar As Decimal = someDoubleVar
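If you want to see that inference in action, here's a minimal sketch (assuming Option Infer On, which is on by default in new projects; the variable names are just for illustration):
Dim i = 1            ' whole number, no type character: inferred as Integer
Dim d = 1.2          ' has a decimal point, no type character: inferred as Double
Dim s = "Hello"      ' quoted text: inferred as String
Console.WriteLine(i.GetType().Name) ' prints Int32
Console.WriteLine(d.GetType().Name) ' prints Double
Console.WriteLine(s.GetType().Name) ' prints String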
So, how can you specify the type of a literal? In VB, you do so by adding a type character as a suffix at the end of your literal. So, if you want to force the compiler to interpret that value as a Decimal, you need to add a D to the end, like this:
Dim someDecimalVar As Decimal = 210.00483839999998D
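For reference, the other numeric type characters work the same way (the suffixes below are part of the language; the variable names are just for illustration):
Dim someShort As Short = 1S        ' S forces Short
Dim someInteger As Integer = 1I    ' I forces Integer
Dim someLong As Long = 1L          ' L forces Long
Dim someSingle As Single = 1.2F    ' F forces Single
Dim someDouble As Double = 1.2R    ' R forces Double
Dim someDecimal As Decimal = 1.2D  ' D forces Decimal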
Actually, unless you add the type character to force it to a Decimal, the compiler will give you an error saying:
Option Strict On disallows implicit conversions from 'Double' to 'Decimal'.
What that means is, the compiler knows that the conversion from Double to Decimal may cause the value to lose some precision, so it refuses to do the conversion automatically; it wants you to manually specify that you want to do the conversion so that it knows you're aware of the implications. It's an extra safety check that the compiler is doing for you just to ensure that you don't accidentally do anything stupid. The only way to get that to compile without the type character is to turn Option Strict Off, so I have to assume that's what you've done. I would strongly suggest turning Option Strict On since it helps to catch these kinds of problems.
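With Option Strict On you can still perform the conversion; you just have to write it out with an explicit cast such as CDec, which keeps the potential precision loss visible in the code. A minimal sketch:
Option Strict On

Dim someDoubleVar As Double = 210.00483839999998
' Dim implicitVar As Decimal = someDoubleVar     ' compile error under Option Strict On
Dim someDecimalVar As Decimal = CDec(someDoubleVar) ' compiles, but digits already lost in the Double stay lost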
Any time you convert between decimals and doubles, the value can lose some precision in the conversion (since they store the value in memory in two different, sometimes incompatible, ways). So, observe:
Dim someDoubleVar As Double = 210.00483839999998R ' Setting Double to Double retains precision
Dim someDoubleVar2 As Double = 210.00483839999998D ' Setting Double to Decimal loses precision
Dim someDecimalVar As Decimal = 210.00483839999998D ' Setting Decimal to Decimal retains precision
Dim someDecimalVar2 As Decimal = 210.00483839999998R ' Setting Decimal to Double loses precision
Note: the above code will only compile with Option Strict Off. I know... I just said not to do that, but I'm trying to illustrate why your code is doing what it's doing.
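To actually see the difference, print the two Decimal variables from the snippet above. Decimal.ToString always shows the exact stored value, and (if I'm reading the conversion rules right) the Double-to-Decimal conversion rounds to roughly 15 significant digits:
Console.WriteLine(someDecimalVar)  ' 210.00483839999998 - the Decimal literal kept every digit
Console.WriteLine(someDecimalVar2) ' 210.0048384 - digits were lost passing through Double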
Upvotes: 4
Reputation: 4439
I had a quick play around with the number in your question, and if you convert the Double to a Decimal explicitly when you assign it to the Decimal variable, you get the correct number. E.g.:
Dim x As Double = 210.00483839999998
Dim someDecimalVar As Decimal = CDec(x)
Upvotes: 1