Reputation: 1689
I have been testing out Decimal in Python3, and I have come across some strange things that did not make any sense to me.
First of all, I imported Decimal
from decimal import *
Next, I set the accuracy (in digits) that I want for any calculations:
getcontext().prec = 50
Then, I defined a variable called num, which I intended to be 0.6 recurring:
num = Decimal(2/3)
However, when I try to print num, I get this:
print(num)
0.66666666666666662965923251249478198587894439697265625
Also, changing the precision to either of these:
getcontext().prec = 500
getcontext().prec = 3
changes nothing; even at 3 it gives the same output.
So there are two things that I don't understand here:

Why does it print
0.6666666666666666
? I was expecting it to say 0.6 recurring, with as many 6's as the number defined in getcontext().prec.

Isn't getcontext().prec = 3
supposed to make it 3 digits long? Because it's still printing a lot more than that, and getcontext().prec = 500
also doesn't make it anywhere near 500 digits long.

Edit: I am using Python 3 on Windows.
Upvotes: 1
Views: 1079
Reputation: 133998
You're converting the result of the division 2/3, which is an IEEE 754 double-precision floating-point approximation of 2/3, i.e. 0.6666..., into Decimal
. The Decimal constructor stores that float's exact binary value, and the context precision applies only to arithmetic operations, not to construction — which is why changing prec changes nothing. Divide one Decimal
by another instead, and you get
>>> from decimal import *
>>> getcontext().prec = 50
>>> Decimal(2) / Decimal(3)
Decimal('0.66666666666666666666666666666666666666666666666667')
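To make the precision behaviour concrete, here is a short sketch: the constructor is exact regardless of prec, while any operation rounds to the context precision (the unary + is itself an operation, so it can be used to round an existing Decimal):

```python
from decimal import Decimal, getcontext

getcontext().prec = 3

# The constructor is exact: it stores every digit of the float 2/3,
# ignoring the context precision entirely.
exact = Decimal(2/3)
print(exact)   # 0.66666666666666662965923251249478198587894439697265625

# Arithmetic, by contrast, rounds its result to the context precision.
print(Decimal(2) / Decimal(3))   # 0.667

# Unary plus performs a (no-op) operation, rounding the value
# to the current precision.
print(+exact)   # 0.667
```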
However, these are still approximations. If you're dealing with fractions only (just doing arithmetic), use fractions.Fraction
:
>>> import fractions
>>> fractions.Fraction('2') / 3
Fraction(2, 3)
>>> str(_)
'2/3'
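Fraction arithmetic is exact at every step, so nothing is lost until you explicitly convert. A minimal sketch (the variable name is illustrative) of doing the arithmetic in Fraction and converting to Decimal only at the end, the single point where rounding happens:

```python
from fractions import Fraction
from decimal import Decimal, getcontext

# Every intermediate result is an exact rational number.
total = Fraction(2, 3) + Fraction(1, 6)   # 4/6 + 1/6
print(total)   # 5/6

# Convert to Decimal only when a decimal string is needed.
getcontext().prec = 50
print(Decimal(total.numerator) / Decimal(total.denominator))
```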
Upvotes: 4