Reputation: 835
What would be a good way to deal with receiving duplicate HTTP requests on the server?
I have reports in a LAMP web app that take ~30 seconds to build on the server and return to the client. Clients get impatient and run the report again before the first one finishes, which bogs down the server. Is there any way to handle/prevent this server-side?
Upvotes: 5
Views: 1047
Reputation: 2964
30 seconds is just way too long to make a user wait in today's web app world. Google searches the whole web in 1/100th of that time. What makes your web app more advanced or data-consuming? If your single instance can't handle it even after optimization, you had better scale it out to the cloud (or more servers). If it's really that demanding, divide it into subtasks that can run in parallel on several instances.
As for your direct question, rather than the question you did not ask: the other answers hold some tricks. Either you can generate the frequently requested reports before they are even asked for, or you can let only the first request trigger the rendering and make subsequent requests wait for it to finish before you send back the result.
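For the pre-generation route, one simple sketch is a cron job that rebuilds the report into a cache file on a schedule, so the web request only has to serve that file (render_report() and the paths below are placeholders standing in for your existing code):

```php
<?php
// build_report.php - run from cron, e.g.:  */15 * * * * php /path/to/build_report.php
$html = render_report();   // your existing ~30 second report builder (assumed name)

// Write atomically so a request never sees a half-written report.
$cache = '/var/cache/reports/monthly.html';
file_put_contents($cache . '.tmp', $html);
rename($cache . '.tmp', $cache);
```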
Upvotes: 0
Reputation: 122659
Nowadays, telling the user that something is being processed tends to be done with AJAX.
Typically, you'd send the request to your server, which would then return a 202 response and an address (possibly containing a UUID) where the browser can find the result. You wouldn't necessarily display that address directly; your script would keep it in the background, though you could show it directly to clients that don't support JavaScript.
Then, you would make subsequent requests in the background to that address, and display the result when it's ready.
This approach not only has the advantage of being more user friendly, but it's also more robust against disconnections.
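A minimal server-side sketch of that flow, assuming PHP 7+, a hypothetical generate_report.php worker, and a placeholder cache directory. The first endpoint accepts the request, starts the job, and answers 202 with a polling URL:

```php
<?php
// report.php - accept the request, start the build, return 202 + a polling URL.
$uuid = bin2hex(random_bytes(16));

// Run the long job in the background; generate_report.php is assumed to write
// /var/cache/reports/<uuid>.html when it finishes (hypothetical script and path).
exec('php generate_report.php ' . escapeshellarg($uuid) . ' > /dev/null 2>&1 &');

http_response_code(202);
header('Content-Type: application/json');
echo json_encode(['status' => 'pending', 'result' => '/status.php?id=' . $uuid]);
```

The address returned above points at a status endpoint that the browser polls in the background; it keeps answering 202 until the report file exists:

```php
<?php
// status.php - polled by the browser (AJAX) until the report is ready.
$uuid = preg_replace('/[^a-f0-9]/', '', $_GET['id'] ?? '');
$file = '/var/cache/reports/' . $uuid . '.html';

if ($uuid === '' || !file_exists($file)) {
    http_response_code(202);                   // still building
    header('Content-Type: application/json');
    echo json_encode(['status' => 'pending']);
} else {
    header('Content-Type: text/html');
    readfile($file);                           // the finished report
}
```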
Upvotes: 1
Reputation: 143906
You can try returning a 202 response until whatever you are building on the server is done, then serve a cached copy of it.
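A sketch of that idea, assuming the finished report gets written to a cache file at a placeholder path: serve the cached copy if it exists, otherwise answer 202 with a Retry-After hint.

```php
<?php
// report.php - hypothetical single-URL variant: 202 until the cached copy exists.
$cache = '/var/cache/reports/monthly.html';    // placeholder cache file

if (file_exists($cache)) {
    header('Content-Type: text/html');
    readfile($cache);                          // serve the cached copy
    exit;
}

http_response_code(202);
header('Retry-After: 10');                     // hint: try again in ~10 seconds
echo 'The report is still being generated; please try again shortly.';
```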
Upvotes: 1
Reputation: 52372
Store the fact that a job is already running somewhere.
In your code that generates the report, check if one is already running. If so, don't run another.
When the report is done generating, or when some timeout expires (to handle exceptional conditions), un-store that fact.
You can use a database, a memcached server, Redis, a text file, shared memory...
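For example, a minimal sketch of this using a lock file (any of the stores above would work the same way; build_report() stands in for your existing report code):

```php
<?php
// generate_report.php - allow only one report build at a time via a lock file.
$lock    = '/tmp/report.lock';   // placeholder lock location
$timeout = 300;                  // treat a lock older than 5 minutes as stale

// If a recent lock exists, a report is already being generated: don't start another.
if (file_exists($lock) && (time() - filemtime($lock)) < $timeout) {
    http_response_code(409);
    exit('A report is already being generated; please wait for it to finish.');
}

touch($lock);                    // store the fact that a job is running
try {
    build_report();              // your existing ~30 second report job (assumed name)
} finally {
    unlink($lock);               // un-store that fact when done (or on failure)
}
```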
Upvotes: 3