Worm

Reputation: 1451

Time measuring accuracy

By implementing the line

start_time = time.time()

at the start of my code and

print("%f seconds" % (time.time() - start_time))

at the end of my code, I have been measuring the performance of my script (which can take hours to run). I have heard that this may not be the best method because it can be inaccurate. How accurate is it, and is there a better alternative?

Upvotes: 2

Views: 692

Answers (2)

user7214612

Reputation:

Try using datetime:

from datetime import datetime

startTime = datetime.now()
# ... code being timed ...
print("Time taken:", datetime.now() - startTime)

Upvotes: 0

ifnotX

Reputation: 73

Try timeit from the standard library:

import logging
from timeit import default_timer as timer

logger = logging.getLogger(__name__)

start_time = timer()
# ... code being timed ...
end_time = timer()
print(end_time - start_time)
logger.info("Duration was {}".format(end_time - start_time))

The documentation for default_timer is worth quoting here: "Define a default timer, in a platform-specific manner. On Windows, time.clock() has microsecond granularity, but time.time()’s granularity is 1/60th of a second. On Unix, time.clock() has 1/100th of a second granularity, and time.time() is much more precise. On either platform, default_timer() measures wall clock time, not the CPU time. This means that other processes running on the same computer may interfere with the timing."
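
If you also want to separate wall-clock time from CPU time, as the quoted note suggests, here is a minimal sketch using time.perf_counter() and time.process_time() (these are alternatives from the modern standard library, Python 3.3+, not what the code above uses):

import time

wall_start = time.perf_counter()   # wall-clock time, highest-resolution timer available
cpu_start = time.process_time()    # CPU time of the current process only

# ... code being timed ...

print("Wall time: %f seconds" % (time.perf_counter() - wall_start))
print("CPU time:  %f seconds" % (time.process_time() - cpu_start))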

Upvotes: 2
