Reputation: 571
I'm trying to build a program that deals with very short time differences, so I need an accurate way to measure how long a statement takes. When I use time.time(), even when the strings I compare are hundreds of characters long, it reports 0.0 (I call time.time() before and after and then subtract). I checked whether Python compares strings character by character with this code:
import time

time1 = time.time()
print s1 == s2              # s2 differs from s1 only in the last character
print time.time() - time1

time2 = time.time()
print s1 == s3              # s3 differs from s1 only in the first character
print time.time() - time2
(Here s1, s2, and s3 are very long strings; s2 is the same as s1 except for the last character, and s3 is the same as s1 except for the first character.) The case I'm actually dealing with, though, is a string of about 10 characters, sometimes even fewer, and I need to find out how long it takes to compare it to something else. Any help?
Edit: time.clock() worked great for me - it is much more accurate than time.time() (on Windows)
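For reference, a minimal sketch of that measurement using time.clock() on Windows/Python 2 (the short strings here are just placeholders):

import time

s1 = 'abcdefghij'            # placeholder short strings
s2 = 'abcdefghik'

t0 = time.clock()            # wraps QueryPerformanceCounter() on Windows
s1 == s2
print(time.clock() - t0)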
Upvotes: 3
Views: 1794
Reputation: 414079
You want time.perf_counter(), which the docs describe as "a clock with the highest available resolution to measure a short duration".
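For the comparisons in the question, a minimal sketch on Python 3 might look like this (the long strings here are made up to stand in for s1, s2, and s3):

import time

s1 = 'x' * 1000000 + 'a'     # ~1 MB string
s2 = 'x' * 1000000 + 'b'     # differs only in the last character
s3 = 'y' + s1[1:]            # differs only in the first character

start = time.perf_counter()
s1 == s2
print('last-char mismatch:  %.9f s' % (time.perf_counter() - start))

start = time.perf_counter()
s1 == s3
print('first-char mismatch: %.9f s' % (time.perf_counter() - start))

The first comparison has to scan the whole string before it finds the mismatch, so it should take measurably longer than the second.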
timeit.default_timer() uses it on Python 3. On Python 2, it calls time.clock() on Windows and time.time() on other systems.
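A single comparison of a ~10-character string takes only tens of nanoseconds, so one before/after measurement is mostly timer overhead. The usual approach (a sketch, with hypothetical strings) is to let timeit run the statement many times and divide:

import timeit

setup = "s1 = 'abcdefghij'; s2 = 'abcdefghik'"   # hypothetical 10-char strings
n = 1000000
# best of 5 runs of n comparisons each; timeit uses default_timer internally
best = min(timeit.repeat('s1 == s2', setup=setup, number=n, repeat=5))
print('per comparison: %.3e seconds' % (best / n))

timeit also disables garbage collection during the run, so the minimum over several repeats is a reasonable estimate of the per-comparison cost.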
Both time.clock() and time.perf_counter() use QueryPerformanceCounter() on Windows, so you could use timeit.default_timer() on both Python 2 and 3 on Windows.
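If you just want a portable drop-in for the manual before/after pattern from the question, timeit.default_timer can be used directly as the timer function (again a sketch; the strings are placeholders):

from timeit import default_timer as timer

s1 = 'abcdefghij'
s2 = 'abcdefghik'

start = timer()              # QueryPerformanceCounter() on Windows
s1 == s2
print('%.9f seconds' % (timer() - start))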
time.perf_counter() (via pymonotonic(NULL)) calls clock_gettime(CLOCK_HIGHRES) on Linux and mach_absolute_time() on OS X. You could call clock_gettime() using ctypes on Python 2 too (or find a backport that does it for you).
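On Linux, a minimal ctypes sketch of that idea for Python 2 might look like this, assuming librt is available and using CLOCK_MONOTONIC (value 1 in the Linux headers):

import ctypes
import ctypes.util
import os

CLOCK_MONOTONIC = 1          # constant from <linux/time.h>

class timespec(ctypes.Structure):
    _fields_ = [('tv_sec', ctypes.c_long), ('tv_nsec', ctypes.c_long)]

librt = ctypes.CDLL(ctypes.util.find_library('rt') or 'librt.so.1', use_errno=True)
librt.clock_gettime.argtypes = [ctypes.c_int, ctypes.POINTER(timespec)]

def monotonic():
    # Returns seconds as a float; tv_nsec carries the sub-second part.
    t = timespec()
    if librt.clock_gettime(CLOCK_MONOTONIC, ctypes.byref(t)) != 0:
        err = ctypes.get_errno()
        raise OSError(err, os.strerror(err))
    return t.tv_sec + t.tv_nsec * 1e-9

start = monotonic()
'abcdefghij' == 'abcdefghik'
print('%.9f seconds' % (monotonic() - start))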
Upvotes: 4