Reputation: 564
I am testing my web server using Apache Bench and I am getting the following responses:
Request: ab -n 1000 -c 20 https://www.my-example.com/
Time per request: 16.264 [ms] (mean, across all concurrent requests)
Request: ab -n 10000 -c 100 https://www.my-example.com/
Time per request: 3.587 [ms] (mean, across all concurrent requests)
Request: ab -n 10000 -c 500 https://www.my-example.com/
Time per request: 1.381 [ms] (mean, across all concurrent requests)
The 'Time per request' is decreasing with increasing concurrency. May I know why? Or is this by any chance a bug?
Upvotes: 0
Views: 513
Reputation: 7614
You should be seeing two values for Time per request: one reported as [ms] (mean) and the other as [ms] (mean, across all concurrent requests). A concurrency of 20 means that 20 simultaneous requests were sent in a single go and that level of concurrency was maintained for the duration of the test.
The lower value, the one "across all concurrent requests", is total_time_taken / total_number_of_requests, so it largely disregards concurrency: the more requests that run in parallel, the more of them finish in the same wall-clock time, which is why this number shrinks as you raise -c. The other value is closer to the mean response time (the actual latency) of your requests. I generally visualize it as x concurrent requests being sent in a single batch; that value is the mean time it took a batch of concurrent requests to complete. It will also be closer to your percentiles, which again points to it being the actual time taken by a request.
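To make the arithmetic concrete, here is a rough sketch of how ab arrives at the two numbers, as I understand its calculation. The 35.87 s total test time below is back-calculated from the 3.587 ms figure in your second run (10000 requests at -c 100), so treat it as illustrative rather than something ab printed:

# Rough illustration of ab's two "Time per request" values (Python sketch).
def time_per_request(total_time_s, total_requests, concurrency):
    # "across all concurrent requests": total test time spread over every request,
    # which disregards how many requests were in flight at once
    across_all_ms = total_time_s * 1000 / total_requests
    # "(mean)": roughly the latency an individual request actually saw,
    # i.e. the value above multiplied by the concurrency level
    mean_ms = across_all_ms * concurrency
    return mean_ms, across_all_ms

mean_ms, across_all_ms = time_per_request(35.87, 10000, 100)
print(round(mean_ms, 3), round(across_all_ms, 3))  # -> 358.7 3.587

So raising -c from 20 to 500 shrinks only the "across all concurrent requests" figure; the "(mean)" value and your percentiles are what to look at for actual per-request latency.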
Upvotes: 1