Reputation: 2183
I wrote a small program to test my server's performance. The program sends 100 requests to the server every second and measures the time it takes to receive an answer. The test program is written in Java, and every request is handled by a separate thread. The requests are numbered, and that number is bounced back by the server so that the client program knows which request was answered. The problem is that the results I'm getting are all over the place. Sometimes a reply takes one second, sometimes 3, and sometimes 10. I can't make much sense of it. Why might this be the case? And if I'm doing something wrong, what would be a better way to test my server?
EDIT For more info: the server is an Amazon Elastic Beanstalk application running on a cluster of 5 EC2 instances. It runs a simple program that does some looping before sending a response containing the number it received in the request. The client records the time when each request is sent out and compares it against the time when the corresponding reply arrives.
EDIT 2 The test runs for 30 minutes and the output is an average of all the times taken. I'm getting wildly divergent results between individual requests, but I'm also getting pretty big differences between the averages of separate runs.
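For reference, here is a minimal sketch of the timing approach described above, assuming a hypothetical blocking sendRequest helper that writes the numbered request and returns the number echoed by the server (the class and method names are illustrative, not my actual code):

```java
import java.util.Queue;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class LatencyTest {

    // Round-trip times in nanoseconds, one entry per completed request.
    private static final Queue<Long> latencies = new ConcurrentLinkedQueue<>();
    private static final AtomicInteger requestId = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService ticker = Executors.newSingleThreadScheduledExecutor();
        ExecutorService workers = Executors.newCachedThreadPool();

        // Fire one request every 10 ms (100 requests per second); each request runs
        // on its own worker thread so a slow reply never delays the next send.
        ticker.scheduleAtFixedRate(() -> workers.submit(() -> {
            int id = requestId.incrementAndGet();
            long start = System.nanoTime();          // monotonic clock, unaffected by NTP adjustments
            int echoedId = sendRequest(id);          // hypothetical blocking call to the server
            long elapsed = System.nanoTime() - start;
            if (echoedId == id) {
                latencies.add(elapsed);
            }
        }), 0, 10, TimeUnit.MILLISECONDS);

        Thread.sleep(TimeUnit.MINUTES.toMillis(30)); // run for the 30-minute test window
        ticker.shutdownNow();
        workers.shutdown();
        workers.awaitTermination(1, TimeUnit.MINUTES);

        latencies.stream()
                 .mapToLong(Long::longValue)
                 .average()
                 .ifPresent(avg -> System.out.printf("average = %.1f ms%n", avg / 1_000_000.0));
    }

    // Placeholder for the real network call: send the numbered request and
    // return the number the server echoes back.
    private static int sendRequest(int id) {
        return id;
    }
}
```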
Upvotes: 0
Views: 1425
Reputation: 2674
Where is your server located? If it is hosted somewhere remotely, then a number of things can happen: the load on the machine that hosts your server might vary, and the network speed might vary. Also, since you are bombarding your server with 100 requests/second, that may be quite a heavy load. Depending on many things, such as waiting on a shared critical resource, the response time can be expected to vary like that. In short, without knowing the nature of your server, we can't say for sure.
What I would take from it is that under a load of 100 requests/second, your server has a worst-case response time of about 10 seconds. You can calculate the average time as well...
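For instance, assuming the client has already collected the round-trip times in milliseconds, a small sketch of a per-run summary might look like this (the names here are only illustrative):

```java
import java.util.Collections;
import java.util.List;

public class LatencyStats {

    // Summarize a run: average, median, and worst-case round-trip time.
    static void report(List<Long> latenciesMs) {
        Collections.sort(latenciesMs);
        double average = latenciesMs.stream().mapToLong(Long::longValue).average().orElse(0);
        long median = latenciesMs.get(latenciesMs.size() / 2);
        long worst = latenciesMs.get(latenciesMs.size() - 1);
        System.out.printf("average = %.1f ms, median = %d ms, worst = %d ms%n",
                          average, median, worst);
    }
}
```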
Upvotes: 0
Reputation: 5538
I think what you are doing here is load testing. It is possible that the processing time varies depending on the nature of the data. E.g., say your server is parsing XML; depending on the size of the XML data, it could take a different amount of time to respond.
It could also depend on whether there is a network call (say, a DB call or a web service call); depending on the third party's response and network performance, the result could vary.
I believe your way of testing the server's performance is still OK, as long as you test with a wide range of possible data, boundary conditions, and different process flows (based on the business logic).
Another way is to set up automated load testing using a tool, but that would take a fair amount of effort depending on the nature of the application and the type of data.
Upvotes: 0
Reputation: 1616
If you are checking the performance of a web server, you can use The Grinder.
More info on this page: http://grinder.sourceforge.net/
Upvotes: 1
Reputation: 275
Well, your internet connection speed probably fluctuates somewhat, especially if you are using it for other things. The same is true for your server; that's why ping isn't constant. And any server-side operations under varying traffic could be affecting the times. Your method should be fine as is.
Upvotes: 0