As you probably know, 0.1 + 0.2 == 0.30000000000000004 in JavaScript. Limited precision when dealing with money can be a problem. To overcome this, one could use BigInt, which of course only works with integer values.
Instead of using dollars as the base unit and cents as a decimal fraction (e.g. 0.10 for ¢10), it is easy to use cents as the base unit (10 for ¢10) and keep everything integer-only.
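To illustrate the difference, a minimal comparison of the two representations:

```javascript
// Dollars as Number: floating-point drift
console.log(0.1 + 0.2 === 0.3);   // false
// Cents as BigInt: exact integer arithmetic
console.log(10n + 20n === 30n);   // true
```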
However, when presenting the value to a user, you probably want to show it in dollars, not cents. But you can't divide a BigInt by a Number:
> BigInt(10) / 100
Uncaught TypeError: Cannot mix BigInt and other types, use explicit conversions
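For reference, the explicit conversion the error message points to does work, though it defeats the purpose for large amounts; a quick sketch:

```javascript
// The explicit conversion the error message suggests:
console.log(Number(BigInt(10)) / 100); // 0.1
// ...but going back through Number reintroduces limited precision once
// amounts exceed Number.MAX_SAFE_INTEGER (2**53 - 1):
console.log(Number(9007199254740993n)); // 9007199254740992
```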
On the other hand, Intl.NumberFormat does not seem to provide a way to do something like this either. So, what can you do instead?
Since the base is 10, a straightforward way would be to insert a decimal point two characters from the end of the number string and pass that to the formatter:
let formatter = Intl.NumberFormat("en-US", {
  style: "currency",
  currency: "USD"
})
let s = (BigInt(10) + BigInt(20)).toString() // "30"
s = s.padStart(2, "0")                       // zero-pad a single digit (input 0..9)
s = s.slice(0, -2) + "." + s.slice(-2)       // ".30"
formatter.format(s)                          // "$0.30"
Is this the way to go? Are there better solutions?
I'm aware that format(0.30000000000000004) also results in $0.30 because of rounding, but this is rather a general question about BigInt + UI.
You can divide a bigint value by 100n (the bigint 100) to get the right number of dollars:
function toDollarsString(total) {
  const dollars = total / 100n; // bigint division truncates toward zero
  const cents = total % 100n;
  // Pad the cents to two digits so that, e.g., 5n yields "$0.05", not "$0.5"
  return `$${dollars}.${cents.toString().padStart(2, "0")}`;
}
console.log(toDollarsString(30n));   // "$0.30"
console.log(toDollarsString(1024n)); // "$10.24"
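Building on that, the same dollars/cents split can feed the Intl.NumberFormat approach from the question, so locale, currency symbol, and digit grouping are still handled for you. A sketch (the helper name toLocalizedDollars is made up here, and it assumes a non-negative total):

```javascript
// Hypothetical helper: split bigint cents, then let Intl.NumberFormat
// handle locale, symbol, and grouping. Assumes total >= 0n.
function toLocalizedDollars(total) {
  const dollars = total / 100n;
  const cents = total % 100n;
  const formatter = new Intl.NumberFormat("en-US", {
    style: "currency",
    currency: "USD"
  });
  // format() accepts a numeric string, avoiding a lossy Number conversion
  return formatter.format(`${dollars}.${cents.toString().padStart(2, "0")}`);
}
console.log(toLocalizedDollars(30n));   // "$0.30"
console.log(toLocalizedDollars(1024n)); // "$10.24"
```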