Reputation: 3939
Firstly, I'm a bit new to Python. I know this floating-point arithmetic question seems very basic, but I can't find any duplicate/related question on SO.
I have an acceptance test: expect 3.3 / 3 to be 1.1.
Then I tried:
>>> from decimal import *
>>> Decimal(3.3) / Decimal(3)
Decimal('1.099999999999999940788105353')
>>> Decimal(3.3) / Decimal(3.0)
Decimal('1.099999999999999940788105353')
>>> Decimal('3.3') / Decimal('3')
Decimal('1.1') # as expected
Question: What is the best practice for using Python's Decimal in a predictable way? Or do I simply need to convert every decimal to a string before constructing it?
To be more specific: I'm writing a small automation script for a loan data report.
Upvotes: 1
Views: 165
Reputation: 599856
The point is that in passing the bare float 3.3 to Decimal, you're already subject to floating-point imprecision:
>>> Decimal(3.3)
Decimal('3.29999999999999982236431605997495353221893310546875')
So, yes, you should always pass strings.
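A minimal sketch of that advice, assuming you sometimes receive a float value x from elsewhere rather than typing a literal (str(x) round-trips the short repr '3.3', which is usually the value you meant, though it may differ from the float's exact binary value):
>>> from decimal import Decimal
>>> Decimal('3.3') / Decimal('3')    # build from strings: exact decimal inputs
Decimal('1.1')
>>> x = 3.3                          # a float you already have
>>> Decimal(str(x)) / Decimal(3)     # str(x) == '3.3', the shortest round-trip repr
Decimal('1.1')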
Upvotes: 5
Reputation: 1695
Looking at https://docs.python.org/2/library/decimal.html, it is possible to set the precision for your operations. The default is 28 significant digits.
>>> from decimal import *
>>> getcontext().prec = 2
>>> Decimal(3.3) / Decimal(3)
This returns Decimal('1.1').
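Note that getcontext().prec = 2 changes the precision for every subsequent Decimal operation in the thread. If you only want the rounding in one place, two scoped alternatives from the same module are localcontext() and quantize(); a sketch, not the only way:
>>> from decimal import Decimal, localcontext
>>> with localcontext() as ctx:    # precision change is limited to this block
...     ctx.prec = 2
...     Decimal(3.3) / Decimal(3)
...
Decimal('1.1')
>>> (Decimal(3.3) / Decimal(3)).quantize(Decimal('0.1'))  # round a single result
Decimal('1.1')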
Upvotes: 3