Reputation: 2057
The following code performs better in Python 2.x than in Python 3.x
from time import time
n = 100000000
s = time()
while n > 0:
    n -= 1
print(time() - s)
On 2.7 it yields 7.84500002861, and on 3.4 it yields 14.969856023788452 (on my machine, of course).
This is because Python 3.x removed the special treatment of small ints and treats every integer like the long type from 2.x.
Is there a way to get similar (or better) performance for integer operations in Python 3.x? Is there an int type similar to the one in 2.x?
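(A common workaround, not suggested in the answers below, is to avoid the hand-rolled decrement loop entirely: a for loop over range does the counting in C and skips the per-iteration Python-level comparison. A minimal sketch, with the loop count reduced from the original 10**8 so it finishes quickly:)

```python
from time import time

n = 10 ** 7  # reduced from the question's 10**8 for a quick run
s = time()
for _ in range(n):  # range iterates in C; no Python-level "n > 0" test
    pass
elapsed = time() - s
print(elapsed)
```

On 2.x the equivalent would be xrange, since range there builds a full list.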
Also, I don't get why this change was made; it hurts performance and I don't see much benefit.
Upvotes: 0
Views: 389
Reputation: 48546
You're doing one hundred million subtractions in around ten seconds. That's ten per microsecond, or a hundred sixty thousand per frame.
Somehow I doubt this is actually your bottleneck. Have you measured your actual code? What made you decide to measure and blame int performance specifically?
For what it's worth I get similar results to razpeitia: about half the difference in Python 3 is due to the comparison.
Upvotes: 2
Reputation: 1997
Stop assuming stuff before actually profiling the code.
What's expensive is the comparison, not the addition or subtraction.
It's insanely faster with a different comparison:
from time import time
n = 100000000
s = time()
while n:
    n -= 1
print(time() - s)
Comparisons in Python have always been expensive.
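(The same measurement can be sketched with timeit, which handles the timing loop for you; the loop counts here are scaled down so it runs quickly, and the absolute numbers are illustrative only.)

```python
import timeit

# Each statement resets n and counts it down, with and without the "> 0" test.
with_cmp = timeit.timeit("n = 10**5\nwhile n > 0:\n    n -= 1", number=20)
without_cmp = timeit.timeit("n = 10**5\nwhile n:\n    n -= 1", number=20)
print("with comparison:   ", with_cmp)
print("without comparison:", without_cmp)
```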
Time:
Python 2.7, with comparison:    7.28864598274
Python 3.4, with comparison:    13.904897212982178
Python 2.7, without comparison: 7.14143395424
Python 3.4, without comparison: 9.274619102478027
Upvotes: 1
Reputation: 69051
Also, I don't get why this change was made; it hurts performance and I don't see much benefit.
Python is not a number crunching language, and it's not about the absolute best performance. It's about simplicity, clarity, and correctness. The extra bit of speed was not great enough to justify the maintenance burden of two different integer types for the folks who actually develop and maintain Python.
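(The practical upside of the unification is that there is now only one integer type to reason about. A small sketch, assuming CPython 3:)

```python
# In Python 3, values that 2.x split between int (machine word) and
# long (arbitrary precision) share a single int type.
small = 7
big = 2 ** 100                    # would have been a long in Python 2
print(type(small) is type(big))   # both are plain int -> True
print(big * big == 2 ** 200)      # arithmetic works at any size -> True
```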
Upvotes: 6