Reputation: 13
I currently have Apache running with web2py on Windows, using mod_wsgi and Python 2.7.10. What I've noticed is that as concurrent connections increase, the response time per request increases. It may go from 20 ms for 1 connection to 200 ms for 5 concurrent connections, and at 20 concurrent connections I see response times of 800 ms - 1 s for a request that is only 545 B.
Would adding a front end like nginx help resolve this, or is there something that can be changed in the Apache config?
My current apache config limits are:
ThreadLimit 100
ThreadsPerChild 100
MaxRequestsPerChild 10000
AcceptFilter http none
AcceptFilter https none
KeepAlive On
The code that is being executed is:
Javascript:
$(function () {
    function refresh() {
        $.get('/database/domath_stuff',
              {num: document.getElementById('mathstuff').innerHTML},
              function (response) {
                  document.getElementById('mathstuff').innerHTML = response;
              });
    }
    window.setInterval(refresh, 1000);
});
The python is:
def domath_stuff():
    number = int(request.vars.num)
    number = number + 1
    return number
Upvotes: 1
Views: 1714
Reputation: 13
Using Graham's feedback, I switched over to IIS and I am now getting response times under 200 ms with 20 concurrent connections, which for my needs is alright. I followed the web2py documentation for setting up IIS, so it's configured "out of the box" for anyone with this same issue.
Upvotes: 0
Reputation: 58533
The nginx server, suggested by others as an improvement, wouldn't help at all if the issue is in your Python code, because the request is not handled by nginx but by whatever separate Python web hosting mechanism you are using.
A problem may be occurring in the Python application due to incorrect locking of some resource in the application or framework, causing serialisation of request handling. CPU-bound tasks can also cause issues due to the Python global interpreter lock (GIL).
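To make the serialisation point concrete, here is a minimal, self-contained sketch (not taken from the question's application; the lock and function names are hypothetical) showing how one module-level lock held for the duration of each request turns concurrent requests into a queue:

import threading
import time

# Hypothetical module-level lock, standing in for a resource that the
# application or framework locks for the whole request.
shared_lock = threading.Lock()

def handle_request():
    with shared_lock:      # every request thread queues here
        time.sleep(0.02)   # simulate 20 ms of real work

def simulate(concurrency):
    threads = [threading.Thread(target=handle_request) for _ in range(concurrency)]
    start = time.time()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return (time.time() - start) * 1000

if __name__ == "__main__":
    # ~20 ms for one request, but roughly 20 * 20 ms = 400 ms for 20
    # concurrent requests, because only one can hold the lock at a time.
    print("1 concurrent request:   %.0f ms" % simulate(1))
    print("20 concurrent requests: %.0f ms" % simulate(20))

With such a lock in place, total time grows linearly with the number of concurrent requests, which matches the pattern in the question even though each individual request only does about 20 ms of work.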
You could well be exacerbating the problems with a poor Apache MPM configuration. Your thread directive values are inconsistent: ThreadsPerChild at 1000 is wrong, as it is greater than ThreadLimit. Using a high number of threads in Python applications is also a bad idea because of the Python GIL; one would tend towards a small number of threads and multiple processes. Unfortunately you are using Windows, which is a very poor platform for running Python web applications, and your Apache cannot use multiple processes.
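As a rough illustration of why multiple processes rather than many threads are the usual remedy for CPU-bound Python work (the option Windows takes away here), this sketch compares a thread pool with a process pool on the same task; it is not part of the answer, and on Python 2 it would need the futures backport:

import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def cpu_bound(n):
    # Pure-Python arithmetic holds the GIL the whole time it runs.
    total = 0
    for i in range(n):
        total += i * i
    return total

def run(pool_cls, workers=4, jobs=8):
    start = time.time()
    with pool_cls(max_workers=workers) as pool:
        list(pool.map(cpu_bound, [500000] * jobs))
    return time.time() - start

if __name__ == "__main__":  # the guard is required for process pools on Windows
    # Threads take roughly as long as running the jobs one after another,
    # because the GIL lets only one thread execute Python bytecode at a time;
    # processes each get their own interpreter and GIL, so they run in parallel.
    print("thread pool:  %.2f s" % run(ThreadPoolExecutor))
    print("process pool: %.2f s" % run(ProcessPoolExecutor))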
Depending on what HTTP client you are using for testing, the use of KeepAlive could also be complicating things and resulting in serialisation of requests.
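One way to see whether the test client, rather than the server, is serialising requests is to compare a single reused keep-alive connection with genuinely parallel connections. This is a rough sketch, not part of the answer: it assumes the third-party requests package and a made-up local URL.

import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "http://localhost/database/domath_stuff?num=1"  # hypothetical test URL

def serial_keepalive(n):
    # One keep-alive connection carries one request at a time, so a
    # "concurrent" test driven this way really measures a queue.
    session = requests.Session()
    start = time.time()
    for _ in range(n):
        session.get(URL)
    return (time.time() - start) * 1000 / n

def parallel_connections(n):
    # n requests genuinely in flight at once, each on its own connection.
    def one(_):
        start = time.time()
        requests.get(URL)
        return (time.time() - start) * 1000
    with ThreadPoolExecutor(max_workers=n) as pool:
        return sum(pool.map(one, range(n))) / n

if __name__ == "__main__":
    for n in (1, 5, 20):
        print("%2d requests: keep-alive %.0f ms/req, parallel %.0f ms/req"
              % (n, serial_keepalive(n), parallel_connections(n)))

If the parallel numbers climb the way the question describes while the serial per-request numbers stay flat, the slowdown is on the server side rather than in the client.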
Upvotes: 0
Reputation: 3288
nginx is typically faster than Apache, but with a request rate this low it hardly matters. There are dozens of reasons why this could be happening; this is an incredibly common situation referred to as bottlenecking. The simplest explanation is that your application consumes more resources and accepts more concurrent transactions than your server can handle, but since your requests are so few you can count them on one hand, the obvious answer is: your application is slow.
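If you want to check whether it really is the application code, one option is to time the controller itself and compare that against the end-to-end response time. A minimal sketch, assuming you can add a decorator and logging to the web2py controller (the decorator and logger names are illustrative, not from the question):

import functools
import logging
import time

logger = logging.getLogger("timing")

def timed(func):
    """Log how long the wrapped controller action takes."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        try:
            return func(*args, **kwargs)
        finally:
            logger.info("%s took %.1f ms", func.__name__,
                        (time.time() - start) * 1000)
    return wrapper

# In the controller; `request` is supplied by the web2py environment, as in
# the original question's code.
@timed
def domath_stuff():
    return str(int(request.vars.num) + 1)

If the logged time stays at a millisecond or two while the end-to-end time climbs towards a second under load, the time is being spent in the serving stack rather than in the application code.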
Upvotes: 1