Nikolay Shebanov

Reputation: 1443

Default rounding strategy in Intl.NumberFormat.prototype.format()

I'd like to understand the behavior of decimal rounding that happens implicitly when using Intl.NumberFormat.prototype.format(). The MDN documentation of the format function lacks a clear description of that behavior.

Let's take the following piece of code as an example:

const amount = 654321.985;
const formattedAmount = new Intl.NumberFormat('en-EN', {
    style: 'currency',
    currency: 'EUR',
}).format(amount);

console.log(formattedAmount); // => €654,321.99

What is the rounding strategy in this case, and what does it depend on? I would assume that many factors are involved, such as the browser, locale, and currency, but I couldn't find confirmation of that.

After playing around with this sample, I can see that the fractional part is rounded up to the next cent at half a cent, and down for anything below, e.g. .9849 becomes .98. The strategy seems to follow the "round to nearest, ties to even" rule. At the same time, I understand that the logic can be more complicated given the ECMAScript specification.
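The observation can be probed with a small sketch (the sample values are mine, chosen to bracket the half-cent boundary):

```javascript
// Probe the implicit rounding by formatting values just below,
// at, and just above the half-cent boundary.
const fmt = new Intl.NumberFormat('en-EN', {
    style: 'currency',
    currency: 'EUR',
});

for (const amount of [654321.9849, 654321.985, 654321.9851]) {
    console.log(amount, '=>', fmt.format(amount));
}
```

One caveat: the argument is a binary double, so what gets rounded is the double's actual value, not the decimal literal as written; a literal like `.985` may be stored as a value slightly below or above it.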

Upvotes: 5

Views: 1931

Answers (1)

BogdanBiv

Reputation: 1518

Starting from the MDN examples, inspect the formatter's resolved options:

Intl.NumberFormat('en-EN', {
    style: 'currency',
    currency: 'EUR',
}).resolvedOptions()

// properties
currency: "EUR"
currencyDisplay: "symbol"
currencySign: "standard"
locale: "en"
maximumFractionDigits: 2
minimumFractionDigits: 2
minimumIntegerDigits: 1
notation: "standard"
numberingSystem: "latn"
signDisplay: "auto"
style: "currency"
useGrouping: true
__proto__: Object
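Note `maximumFractionDigits: 2` in that output: for the currency style, the fraction-digit bounds default to the currency's minor units (two for EUR), and they are what trigger the rounding. Overriding the option moves the rounding point; a quick sketch:

```javascript
const amount = 654321.985;

// Default for EUR: two fraction digits, so rounding happens at the cent.
const cents = new Intl.NumberFormat('en-EN', {
    style: 'currency',
    currency: 'EUR',
}).format(amount);

// Raising maximumFractionDigits moves the rounding point past the mills.
const mills = new Intl.NumberFormat('en-EN', {
    style: 'currency',
    currency: 'EUR',
    maximumFractionDigits: 3,
}).format(amount);

console.log(cents); // rounded to two fraction digits
console.log(mills); // three fraction digits, no rounding at the cent
```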

The TC39 document you linked to seems pretty informative in sections 13.2 and 13.3:

// typing `Intl.NumberFormat` in the console doesn't take you anywhere, but
// typing `Intl.NumberFormat.prototype` results in:
constructor: ƒ NumberFormat()
format:
formatToParts: ƒ formatToParts()
resolvedOptions: ƒ resolvedOptions()
Symbol(Symbol.toStringTag): "Intl.NumberFormat"
get format: ƒ format()

13.3.3 Internal slots:

must be string values that must contain the substring "{number}". "positivePattern" must contain the substring "{plusSign}" but not "{minusSign}";

Patterns seem to be of the shape `positivePattern = "{plusSign}{number}"`.
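That pattern plumbing is observable from script via `formatToParts()` (listed on the prototype above), which returns the formatted string split into typed pieces:

```javascript
const parts = new Intl.NumberFormat('en-EN', {
    style: 'currency',
    currency: 'EUR',
}).formatToParts(654321.985);

// Each entry is { type, value } with types such as 'currency',
// 'integer', 'group', 'decimal', and 'fraction'.
console.log(parts);
```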

Also take a peek at 13.4.5 Intl.NumberFormat.prototype.resolvedOptions ( ):

[[MinimumIntegerDigits]]    "minimumIntegerDigits"
[[MinimumSignificantDigits]]    "minimumSignificantDigits"
[[MaximumSignificantDigits]]    "maximumSignificantDigits"
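Per the spec, when the significant-digit slots are set they take precedence over the fraction-digit ones, so rounding then happens at a significant digit instead of a fraction digit; a sketch:

```javascript
// With maximumSignificantDigits set, the fraction-digit settings are
// superseded and rounding happens at the fourth significant digit.
const sig = new Intl.NumberFormat('en-EN', {
    maximumSignificantDigits: 4,
}).format(654321.985);

console.log(sig); // rounded to 4 significant digits
```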

Also, don't forget to check MDN: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Intl/NumberFormat

Upvotes: 3
