Reputation: 85
I am returning to a functional Python script with the intent of optimizing the runtime. For the most part, I have been using timeit and tqdm to track how long individual functions take to run, but is there a way to run the script once and track the performance of every function call, producing a single combined output?
For example:
def funct_a(a):
    print(a)

def funct_b(b):
    complex_function(b)  # stand-in for some expensive call

def funct_c(c):
    return c - 5

funct_a(5)
funct_b('Oregon')
funct_c(873)
Ideally I would like to see output from a performance check that reads something like this:
funct_a runtime: .000000001 ms
funct_b runtime: 59 ms
funct_c runtime: .00000002 ms
Any ideas would be greatly appreciated.
Upvotes: 3
Views: 4175
Reputation: 23166
Use the timeit module:
import timeit

def funct_a(a):
    return a

def funct_b(b):
    return [b] * 20

def funct_c(c):
    return c - 5
>>> print(timeit.timeit('funct_a(5)', globals=globals()))
0.09223939990624785
>>> print(timeit.timeit('funct_b("Oregon")', globals=globals()))
0.260303599992767
>>> print(timeit.timeit('funct_c(873)', globals=globals()))
0.14657660003285855
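Note that timeit runs the statement 1,000,000 times by default, so the figures above are totals, not per-call times. You can pass the number argument and divide to get an average per call, closer to the per-function output the question asks for. A minimal sketch:

```python
import timeit

def funct_c(c):
    return c - 5

# Run the statement 10,000 times and report the average per-call time in ms.
n = 10_000
total = timeit.timeit('funct_c(873)', globals=globals(), number=n)
print(f'funct_c runtime: {total / n * 1000:.6f} ms')
```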
Upvotes: 1
Reputation: 340
Use a profiler.
I like to use the default profiler (already included in Python) called cProfile.
You can then visualise the data using snakeviz.
Here is a rough sketch of how to use it:
import cProfile
import pstats

with cProfile.Profile() as pr:
    ...  # {CODE OR FUNCTION HERE}

stats = pstats.Stats(pr)
stats.sort_stats(pstats.SortKey.TIME)

# Now you have two options: either print the data or save it to a file
stats.print_stats()  # print the stats to the console
stats.dump_stats("File/path.prof")  # saves the data to a file, which can be used to view it visually
Now to visualise it:
snakeviz filename.prof
For further clarification, watch this video: https://www.youtube.com/watch?v=m_a0fN48Alw&t=188s&ab_channel=mCoding
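Applied to the question's functions, a self-contained sketch might look like this (the context-manager form of cProfile.Profile requires Python 3.8+; here the report is captured to a string rather than printed, but stats.print_stats() with no stream argument writes to stdout):

```python
import cProfile
import io
import pstats

def funct_a(a):
    return a

def funct_b(b):
    return [b] * 20

def funct_c(c):
    return c - 5

# Profile all three calls in one pass.
with cProfile.Profile() as pr:
    funct_a(5)
    funct_b('Oregon')
    funct_c(873)

# Collect the per-function stats, sorted by internal time.
buf = io.StringIO()
stats = pstats.Stats(pr, stream=buf)
stats.sort_stats(pstats.SortKey.TIME)
stats.print_stats()
report = buf.getvalue()
print(report)
```

Each profiled function appears as its own row in the report, with call count and cumulative time, which is exactly the "single output for the whole script" the question asks about.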
Upvotes: 7
Reputation: 122
import time
start = time.time()
#code goes here
end = time.time()
print('Time for code to run: ', end - start)
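The same idea can be wrapped in a decorator so that every call prints its own runtime, close to the output format the question asks for. A sketch, using time.perf_counter, which is better suited to timing short intervals than time.time:

```python
import functools
import time

def timed(func):
    """Print the wall-clock runtime of each call to the decorated function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f'{func.__name__} runtime: {elapsed_ms:.6f} ms')
        return result
    return wrapper

@timed
def funct_c(c):
    return c - 5

funct_c(873)  # prints a line like "funct_c runtime: ... ms" and returns 868
```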
Upvotes: 1