Reputation: 1433
I have inherited an ASP.NET web application (a WebForms web site) that performs very badly indeed. A quick look at the code reveals an encyclopedia of how not to write .NET apps: string concatenation all over the place, database access inside loops, file I/O in the master page's Page_Load, and so on.
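To illustrate the kind of thing I mean, here is a rough sketch of one of those patterns and the sort of change I've been making (the names and the file are placeholders, not the real code):

```csharp
// Rough sketch: cache the result of the file read that was being done on every
// request in the master page's Page_Load, and build markup with StringBuilder
// instead of repeated string concatenation. "menu.xml" and litMenuHtml are placeholders.
using System;
using System.IO;
using System.Text;
using System.Web;
using System.Web.Caching;

public partial class SiteMaster : System.Web.UI.MasterPage
{
    protected string MenuHtml { get; private set; }

    protected void Page_Load(object sender, EventArgs e)
    {
        // Read the file once and keep it in the application cache for 10 minutes,
        // instead of hitting the disk on every request.
        var menuXml = (string)Cache["MenuXml"];
        if (menuXml == null)
        {
            menuXml = File.ReadAllText(Server.MapPath("~/App_Data/menu.xml"));
            Cache.Insert("MenuXml", menuXml, null,
                         DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
        }

        // Build the markup with StringBuilder rather than concatenating in a loop.
        var sb = new StringBuilder();
        foreach (var line in menuXml.Split('\n'))
        {
            sb.Append("<li>").Append(HttpUtility.HtmlEncode(line.Trim())).Append("</li>");
        }
        MenuHtml = sb.ToString(); // rendered in the .master markup
    }
}
```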
This application is going to get a major rewrite, but I still need to make it a bit more robust in the meantime, so I've been doing lots of load testing with LoadUIWeb 2. I have created a very simple scenario that just logs in and waits on the home page.
I've done a bit of optimisation, but I really don't understand the variance in the load test results.
For example, with my simple scenario and 10 virtual users I get results similar to:
Max Page Load: 12.28s
Ave Page Load: 4.26s
Min Page Load: 0.43s
The users do not ramp up and I do not get a nice curve; page load times 'randomly' peak and trough. How can the same page take anywhere between 0.4s and 12.2s? I would expect less variance (e.g. always high or always medium).
I've done most of the obvious things: caching, compression, debug mode off, tracing off, and so on. I've even tried pre-compiling the whole site, all to no avail.
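For reference, this is the sort of sanity check I mean by "debug mode off, tracing off" (a rough sketch; the page name is a placeholder):

```csharp
// Rough sanity-check sketch: confirm that the deployed site really is running with
// debug compilation and tracing switched off, since either one skews load-test timings.
using System;
using System.Web;
using System.Web.UI;

public partial class HomePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (Context.IsDebuggingEnabled)
        {
            // <compilation debug="true"> disables batch compilation and some caching.
            throw new InvalidOperationException("web.config still has debug=\"true\".");
        }
        if (Trace.IsEnabled)
        {
            // Page/application tracing adds overhead to every request.
            throw new InvalidOperationException("Tracing is still enabled.");
        }
    }
}
```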
What am I missing here?
Upvotes: 1
Views: 187
Reputation: 554
The variance really depends on a lot of things. You should try a really slow ramp-up to 10 virtual users over the course of a minute, to see when the variance really begins to increase dramatically.
Also, you may want to include server metrics (from both your database server and your file/web server) to see if it's a specific resource that's causing the delay.
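If you'd rather spot-check a few counters directly on the web server while the test runs, something along these lines works; the counter names below are only examples and vary by OS and .NET version:

```csharp
// Minimal console sketch for sampling a couple of server counters during a test run.
// Assumes it runs on the web server with permission to read performance counters.
using System;
using System.Diagnostics;
using System.Threading;

class CounterSpotCheck
{
    static void Main()
    {
        var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        var queued = new PerformanceCounter("ASP.NET", "Requests Queued");

        // The first NextValue() of a rate counter returns 0, so prime it once.
        cpu.NextValue();

        for (int i = 0; i < 30; i++) // roughly 30 seconds of one-second samples
        {
            Thread.Sleep(1000);
            Console.WriteLine("{0:T}  CPU {1,6:F1}%   Requests Queued {2,6:F0}",
                              DateTime.Now, cpu.NextValue(), queued.NextValue());
        }
    }
}
```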
Finally, check the report's 'top 10' and optimize the slowest routines first (if you have code or server administration access), as these often act in concert to produce wild and unexpected variances, even at low load.
You can also contact SmartBear pre-sales support, and they'll put you in contact with a tech who can help figure out what's really going on here.
Upvotes: 1