Reputation: 7261
When timing my (web) application for performance/latency, should I use the minimum time measured in n runs, or the average? Or something else?
If it's the latter, when should I use which?
If your answer is going to be something along the lines of, "research it, dude", could you point me to a good resource?
Upvotes: 1
Views: 117
Reputation: 7261
This is a good article on "response time" timing for websites: http://www.webperformancetoday.com/2012/02/13/non-geeky-guide-to-performance-measurement/
Upvotes: 0
Reputation:
I would say you have to figure that one out yourself; it comes down to what you are benchmarking for.
Depending on what question you want your analysis to answer, different metrics should be used.
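As a minimal sketch (assuming you have already collected a list of response times in milliseconds; the sample values here are made up), each metric answers a different question:

```python
import statistics

# Hypothetical response times in milliseconds from 10 runs.
samples_ms = [102, 98, 110, 97, 250, 101, 99, 105, 96, 103]

print("min   :", min(samples_ms))                # best case; hides all variance
print("mean  :", statistics.mean(samples_ms))    # pulled up by the 250 ms outlier
print("median:", statistics.median(samples_ms))  # closer to the typical run
print("p95   :", statistics.quantiles(samples_ms, n=20, method="inclusive")[-1])  # tail latency
```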
Upvotes: 0
Reputation: 74655
Use the median rather than the mean (average).
For reasoning, see the page Mean Delay Considered Harmful by Stanislav Shalunov (the author of thrulay).
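To illustrate the point with a made-up example (not taken from that page): a single outlier, say one request that hits a garbage-collection pause, drags the mean far away from what most users experience, while the median barely moves.

```python
import statistics

# Hypothetical latency samples in ms; the second set contains one pathological outlier.
typical      = [100, 101, 99, 102, 98]
with_outlier = [100, 101, 99, 102, 5000]

print(statistics.mean(typical), statistics.mean(with_outlier))      # 100.0 vs 1080.4
print(statistics.median(typical), statistics.median(with_outlier))  # 100   vs 101
```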
Upvotes: 1
Reputation: 5010
You could collect the values over time, storing each measurement as a pair (time_of_the_call, response_time). Then you can process the data with other tools, draw graphs, and compute statistics. I think a single average, minimum, etc. is not sufficient; you need the whole set of measurements.
For example, you could put your data in a CSV file and import it into Excel, or even use the Google graph API to draw real-time graphs.
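Here is a minimal sketch of that idea, assuming a hypothetical endpoint URL and plain urllib; each CSV row is one (time_of_the_call, response_time) pair that you can graph later:

```python
import csv
import time
import urllib.request

URL = "https://example.com/"  # hypothetical endpoint; replace with the page you measure

with open("latency.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_of_the_call", "response_time_s"])
    for _ in range(10):
        start = time.time()
        urllib.request.urlopen(URL).read()             # one timed request
        writer.writerow([start, time.time() - start])  # when it was made, how long it took
        time.sleep(1)                                  # space the calls out
```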
Upvotes: 1