Reputation: 1595
I have a (C# .NET Core / SQL Express) application in which money calculations are very critical. I've read a lot of blogs and posts saying that float is not precise enough for storing critical things like money values, so I used decimal.
But now that I'm digging deeper into it, it seems the best idea is to store everything as an integer and use the cent value instead of the decimal value for money. So instead of storing a value as € 1.99, I store it as 199.
My first question: Is this really a better and safer approach?
Secondly, I already have a complete application using Entity Framework that is based on decimal values in every function that uses prices.
Since only storing the values in the database might be the problem, I thought of using the getter and setter of the property to convert to and from integers.
So for example:
public class InvoiceDomain
{
    public decimal Total { get; set; }
}
becomes this:
using System.ComponentModel.DataAnnotations.Schema;

public class InvoiceDomain
{
    public int totalCents { get; set; }

    [NotMapped]
    public decimal Total
    {
        // 100m forces decimal division; totalCents / 100 would truncate to whole euros
        get { return totalCents / 100m; }
        // multiply before casting; (int)value * 100 would discard the cents first
        set { totalCents = (int)(value * 100); }
    }
}
Is that a good idea to do?
Upvotes: 4
Views: 1582
Reputation: 199
I would go on using decimal. If you have to calculate fractions, int / int will end up as an int, so you'll have to convert back and use a decimal or double again.
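A minimal sketch of that truncation, with hypothetical values:

int totalCents = 199;
int intShare = totalCents / 3;     // integer division: 66, the leftover cent is silently lost
decimal total = 1.99m;
decimal decShare = total / 3;      // 0.6633333333333333333333333333, precision preserved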
Upvotes: 0
Reputation: 726579
I would stay away from using integers, because storing the exact number of cents becomes problematic when you need to use fractions of a cent. A need to do so may come up as an intermediate result of a computation, even when you do not intend to store fractional cents.
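For example (hypothetical numbers), computing 5% interest on € 1.99 already produces a fraction of a cent as an intermediate value:

int cents = 199;
int interestCents = cents * 5 / 100;   // 9: the remaining 0.95 of a cent is dropped before you can round
decimal amount = 1.99m;
decimal interest = amount * 0.05m;     // 0.0995: the fraction survives until you decide how to round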
Unlike floats, decimals are perfect for storing monetary amounts. SQL Server also has money data types (money and smallmoney) which work perfectly fine, too, but require more care while doing math.
Upvotes: 4