Reputation: 45
I want to do arithmetic on number strings converted with decimal.Decimal, and I want the result to always have two decimal places, regardless of how many decimals the converted number strings have.
In this particular case, why does a precision of two give me a result with one decimal place, while a precision of three gives me the result I want (two decimals -> 2.15)?
from decimal import *

x, y = '2,60', '0,45'  # sample values inferred from the output below
getcontext().prec = 2
a = Decimal(x.replace(',', '.'))
b = Decimal(y.replace(',', '.'))
print(a, b, a - b)
>>> 2.60 0.45 2.2
getcontext().prec = 3
a = Decimal(x.replace(',','.'))
b = Decimal(y.replace(',','.'))
print(a, b, a-b)
>>> 2.60 0.45 2.15
Upvotes: 2
Views: 11448
Reputation: 121
getcontext().prec sets the number of significant digits the decimal module uses for the result of any calculation. It counts significant digits, not decimal places: with prec = 2 the subtraction is rounded to two significant digits (2.2), and with prec = 3 it is rounded to three (2.15).
However, there is a catch here. The precision is not applied when you construct a Decimal: when you created the a and b variables, Decimal kept every digit of the string you gave it, regardless of prec. You can see that by just inspecting a or b in your IDLE.
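A short sketch of both points, using the same values as the question. If you actually want a fixed number of decimal places rather than significant digits, one common approach is to quantize the result instead of lowering prec (the '0.01' template below is just a two-decimal-place target, not something from the original code):

```python
from decimal import Decimal, getcontext

getcontext().prec = 2

# Construction is exact: prec is ignored here, all three digits are kept.
a = Decimal('2.60')
b = Decimal('0.45')
print(a)       # 2.60

# Arithmetic rounds to prec *significant* digits, not decimal places:
# the exact difference 2.15 becomes 2.2 under prec = 2.
print(a - b)   # 2.2

# For a fixed number of decimal places, quantize the result instead.
getcontext().prec = 28  # restore the default precision
result = (a - b).quantize(Decimal('0.01'))
print(result)  # 2.15
```

quantize rounds to the same exponent as its argument, so the result always carries exactly two decimal places no matter how many the inputs had.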
Upvotes: 8