Samuel

Reputation: 95

JMeter JDBC database testing - Max Wait (ms)

What is the best practice for the Max Wait (ms) value in JDBC Connection Configuration?

I am executing 2 types of tests:

  1. 20 loops for each number of threads - to get maximum Throughput
  2. 30 min runtime for each number of threads - to get Response time

With Max Wait = 10000 ms I can execute the JDBC request with 10, 20, 30, 40, 60 and 80 threads without an error. With Max Wait = 20000 ms I can go higher and execute with 100, 120 and 140 threads without an error. This seems to be logical behaviour.

Now the question: can I increase the Max Wait value as much as I want? Is that a correct way to get more test results? Or should I stop testing and not increase the number of threads once any error occurs in a report? I got e.g. 0.06% errors from 10,000 samples. Is this a stopping point for my testing? Thanks.

Upvotes: 1

Views: 1807

Answers (2)

Dmitri T

Reputation: 168197

This setting maps to DBCP -> BasicDataSource -> maxWaitMillis parameter, according to the documentation:

The maximum number of milliseconds that the pool will wait (when there are no available connections) for a connection to be returned before throwing an exception, or -1 to wait indefinitely

It should match the relevant setting of your application's database configuration. If your goal is to determine the maximum performance, just put -1 there and the timeout will be disabled.
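The wait-or-fail semantics that maxWaitMillis controls can be illustrated without DBCP itself. The following is a minimal sketch, using a plain Semaphore as a stand-in for the connection pool; the class and method names are made up for illustration, not DBCP's API:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Minimal sketch (not DBCP itself) of what maxWaitMillis controls:
// a borrower waits up to the timeout for a free connection, then
// fails; -1 means "wait indefinitely".
public class PoolWaitSketch {
    private final Semaphore permits;   // one permit per pooled connection
    private final long maxWaitMillis;  // same meaning as DBCP's maxWaitMillis

    public PoolWaitSketch(int poolSize, long maxWaitMillis) {
        this.permits = new Semaphore(poolSize, true);
        this.maxWaitMillis = maxWaitMillis;
    }

    /** Returns true if a "connection" was borrowed within maxWaitMillis. */
    public boolean tryBorrow() throws InterruptedException {
        if (maxWaitMillis < 0) {       // -1: block until a connection frees up
            permits.acquire();
            return true;
        }
        return permits.tryAcquire(maxWaitMillis, TimeUnit.MILLISECONDS);
    }

    public void release() {
        permits.release();
    }

    public static void main(String[] args) throws InterruptedException {
        PoolWaitSketch pool = new PoolWaitSketch(1, 100); // 1 connection, 100 ms wait
        pool.tryBorrow();                    // first borrower gets the connection
        // Second borrower times out after ~100 ms, like a failed JMeter sampler:
        System.out.println(pool.tryBorrow()); // prints "false"
    }
}
```

This is why raising Max Wait makes errors disappear at higher thread counts: the borrowers simply queue longer instead of failing.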

In regards to "Is this a stopping point for my testing?" - it depends on multiple factors, like what the application is doing, what you are trying to achieve and what type of testing is being conducted. If you are testing a database which orchestrates nuclear plant operation, then a zero error threshold is the only acceptable one. If it is a picture gallery of cats, this error level can be considered acceptable.

In the majority of cases performance testing is divided into several test executions, like:

  1. Load Testing - putting the system under the anticipated load to see if it is capable of handling the forecasted amount of users
  2. Soak Testing - basically the same as Load Testing, but keeping the load on for a prolonged duration. This allows you to detect e.g. memory leaks
  3. Stress Testing - determining the boundaries of the application: saturation points, bottlenecks, etc. Start from zero load and gradually increase it until the application breaks, noting the maximum amount of users, the correlation of other metrics (like Response Time, Throughput and Error Rate) with the increasing amount of users, whether the application recovers when the load gets back to normal, etc.
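The stress-testing loop described in point 3 can be sketched as follows. Everything here is illustrative: the load step, the error budget and the "system under test" are stand-in assumptions, not JMeter internals:

```java
// Illustrative stress-test ramp: increase the load step by step and stop
// once the observed error rate crosses a chosen threshold. The "system"
// below is a toy model standing in for a real server.
public class StressRamp {
    /** Stand-in for the system under test: errors appear past ~100 users. */
    static double observedErrorRate(int users) {
        return users <= 100 ? 0.0 : (users - 100) / 100.0;
    }

    /** Returns the last load level whose error rate stayed under the limit. */
    static int findSaturationPoint(int step, double errorLimit) {
        int users = 0;
        while (observedErrorRate(users + step) < errorLimit) {
            users += step;   // ramp up gradually, as in stress testing
        }
        return users;
    }

    public static void main(String[] args) {
        // With a 0.5% error budget and steps of 20 users:
        System.out.println(findSaturationPoint(20, 0.005)); // prints 100
    }
}
```

In a real test plan the ramp would be driven by the thread group configuration and the error rate read from the listener reports, but the stopping logic is the same.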

See the Why ‘Normal’ Load Testing Isn’t Enough article for the above testing types described in more detail.

Upvotes: 1

Naveen Kumar R B

Reputation: 6398

Everything depends on what your requirements are and how you defined your performance baseline.

Can I increase Max Wait value as desired? Is it correct way how to get more test results?

  • If you are OK with higher response times as long as the functionality keeps working, then you can set Max Wait as high as you want. But in practice there will be a threshold on response times (say, 2 seconds to perform a login transaction), which you define as part of your performance SLA or performance baseline. So even though you make your requests succeed by increasing Max Wait, they will eventually be considered failed requests due to their high response times (crossing the threshold values)

Note: higher response times for DB operations eventually result in higher response times for the web application (i.e. for end users)
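The point above can be made concrete: even samples that succeed functionally should be flagged when they exceed the agreed threshold (2 seconds in the example above). A minimal sketch, with hypothetical sample data:

```java
// Even "successful" samples can breach the SLA: count any sample whose
// response time exceeds the agreed threshold (e.g. 2 s for a login).
public class SlaCheck {
    static long countSlaFailures(long[] responseTimesMs, long thresholdMs) {
        long failures = 0;
        for (long t : responseTimesMs) {
            if (t > thresholdMs) failures++;  // passed functionally, failed SLA
        }
        return failures;
    }

    public static void main(String[] args) {
        long[] samples = {850, 1200, 2600, 400, 3100}; // hypothetical sampler times
        System.out.println(countSlaFailures(samples, 2000)); // prints 2
    }
}
```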

Should I stop testing and do not increase number of Threads if any error occur in some Report?

  • The same applies to error rates. If the SLA says some % error rate is agreed upon, then you can consider the test to be meeting the SLA or performance baseline if the actual error rate is less than that. E.g. if the requirement says 0% error rate, then 0.1% is also considered a failure
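Using the question's own figures, the pass/fail decision against an error-rate budget is just a comparison; the method name and budgets below are illustrative:

```java
// Compare the observed error rate against the agreed SLA budget.
public class ErrorRateCheck {
    static boolean meetsSla(long errors, long samples, double allowedRate) {
        return (double) errors / samples <= allowedRate;
    }

    public static void main(String[] args) {
        // 6 errors out of 10,000 samples = 0.06%, as in the question:
        System.out.println(meetsSla(6, 10_000, 0.001)); // 0.1% budget -> true
        System.out.println(meetsSla(6, 10_000, 0.0));   // 0%   budget -> false
    }
}
```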

Is this stop for my testing?

  • You can stop the test at whatever point you want; it is completely based on what metrics you want to capture. It is generally suggested to continue the test until there is no point in continuing it, e.g. the error rate has reached 99%. If you are getting an error rate of 0.06%, then I suggest continuing the test to find the breaking point of the system: a server crash, response times reaching unacceptable values, memory issues, etc.

Following are some good references:

  1. https://www.nngroup.com/articles/response-times-3-important-limits/
  2. http://calendar.perfplanet.com/2011/how-response-times-impact-business/
  3. difference between baseline and benchmark in performance of an application
  4. https://msdn.microsoft.com/en-us/library/ms190943.aspx
  5. https://msdn.microsoft.com/en-us/library/bb924375.aspx
  6. http://searchitchannel.techtarget.com/definition/service-level-agreement

Upvotes: 2
