Reputation: 679
Consider a very simple timer:
import time

start = time.time()
end = time.time() - start
while end < 5:
    end = time.time() - start
    print(end)
How precise is this timer? Compared to a real-time clock, how synchronized and accurate is it?
Now for the real question:
What is the smallest scale of time that can be measured precisely with Python ?
Upvotes: 0
Views: 334
Reputation: 88977
This depends entirely on the system you are running it on; there is no guarantee Python has any way of tracking time at all.
That said, it's pretty safe to assume you will get millisecond accuracy on modern systems; beyond that, it is highly dependent on the system. To quote the docs:
Although this module is always available, not all functions are available on all platforms. Most of the functions defined in this module call platform C library functions with the same name. It may sometimes be helpful to consult the platform documentation, because the semantics of these functions varies among platforms.
And:
The precision of the various real-time functions may be less than suggested by the units in which their value or argument is expressed. E.g. on most Unix systems, the clock “ticks” only 50 or 100 times a second.
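If you want a rough, empirical idea of the tick size on your own machine, you can spin on time.time() until the returned value changes and record the smallest step you observe. This is a minimal sketch, not a rigorous benchmark, and estimate_resolution is just a name made up here for illustration:

import time

def estimate_resolution(samples=1000):
    # Record the smallest observed change in time.time()
    # across many back-to-back calls.
    smallest = None
    for _ in range(samples):
        t0 = time.time()
        t1 = time.time()
        while t1 == t0:  # spin until the clock actually ticks
            t1 = time.time()
        step = t1 - t0
        if smallest is None or step < smallest:
            smallest = step
    return smallest

print(estimate_resolution())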
Upvotes: 0
Reputation: 1121266
This is entirely platform dependent. Use the timeit.default_timer()
function; it returns the most precise timer for your platform.
From the documentation:
Define a default timer, in a platform-specific manner. On Windows, time.clock() has microsecond granularity, but time.time()'s granularity is 1/60th of a second. On Unix, time.clock() has 1/100th of a second granularity, and time.time() is much more precise.
So on Windows you get microseconds; on Unix you get whatever precision the platform can provide, which is usually (much) better than 1/100th of a second.
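As a sketch, here is the timer from your question rewritten to use timeit.default_timer(), which picks the most precise clock available on the current platform (assuming the same 5-second cutoff as in your example):

import timeit

start = timeit.default_timer()
elapsed = 0.0
while elapsed < 5:
    elapsed = timeit.default_timer() - start
print(elapsed)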
Upvotes: 5