Reputation: 627
I have a small web service that receives, via POST, a JSON with a set of strings, processes them, and outputs the results in JSON format.
The thing is, the processing is very resource-heavy (CPU and memory), so I want to be able to queue the requests and process them one by one (or two by two, etc.).
First I created my server with a QueuedThreadPool
like this:
QueuedThreadPool threadPool = new QueuedThreadPool(4, 1);
Server jettyServer = new Server(threadPool);
But this only limits the number of threads, and if a single client makes multiple requests, the service still crashes.
Next, I tried with a LinkedBlockingQueue, like this:
LinkedBlockingQueue<Runnable> queue = new LinkedBlockingQueue<Runnable>(2);
QueuedThreadPool threadPool = new QueuedThreadPool(4, 1, 30000, queue);
Server jettyServer = new Server(threadPool);
This seems to work: the service only processes two sets of text at a time and does not crash. The problem is that the remaining requests are discarded and receive a 502 status code; the server throws a RejectedExecutionException and continues execution.
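That rejection is standard java.util.concurrent behavior, not something Jetty-specific: once the pool is at its maximum thread count and the bounded work queue is full, execute() throws RejectedExecutionException. A minimal JDK-only sketch of the same mechanism (the class and method names are just for illustration):

```java
import java.util.concurrent.*;

public class RejectionDemo {
    // Submits three tasks to a pool with 1 thread and a queue of capacity 1;
    // returns what happened to the third task.
    static String submitThree() throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<>(1));
        CountDownLatch gate = new CountDownLatch(1);
        // First task occupies the only worker thread until the gate opens.
        pool.execute(() -> {
            try { gate.await(); } catch (InterruptedException ignored) {}
        });
        pool.execute(() -> {}); // second task fills the queue's single slot
        String result;
        try {
            pool.execute(() -> {}); // no free thread, no queue slot -> rejected
            result = "accepted";
        } catch (RejectedExecutionException e) {
            result = "rejected";
        }
        gate.countDown();
        pool.shutdown();
        return result;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(submitThree()); // prints "rejected"
    }
}
```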
Is there a way to configure a queue of requests and process them one by one (or a few at a time) without discarding the rest? In other words, queue the requests but limit how many are served concurrently.
Upvotes: 0
Views: 415
Reputation: 49462
Jetty does not have a 1 thread == 1 request model.
It is entirely possible for 1..n threads to be used over the lifetime of a single request/response exchange.
Don't attempt to control this at the connector or thread pool level; it's not possible.
Instead, consider using the QoSFilter (Quality of Service Filter).
It lets you limit the resources available to a specific endpoint (the servlet performing these long-running operations), while all other requests are processed without the limits the filter imposes.
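A minimal sketch of wiring the QoSFilter into an embedded Jetty server (Jetty 9 servlet APIs; the "/process/*" path and the limit of 2 are assumptions for illustration). Requests over the limit are suspended and resumed when a slot frees up, rather than rejected with a 502:

```java
import java.util.EnumSet;
import javax.servlet.DispatcherType;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlets.QoSFilter;

public class QoSExample {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);
        ServletContextHandler context = new ServletContextHandler();
        context.setContextPath("/");

        // Allow only 2 requests into the heavy endpoint at a time;
        // the rest wait (suspended) instead of being rejected.
        FilterHolder qos = new FilterHolder(QoSFilter.class);
        qos.setInitParameter("maxRequests", "2");
        context.addFilter(qos, "/process/*", EnumSet.of(DispatcherType.REQUEST));

        // context.addServlet(ProcessingServlet.class, "/process"); // your servlet here
        server.setHandler(context);
        server.start();
        server.join();
    }
}
```

The filter lives in the jetty-servlets artifact; `maxRequests` caps concurrency, and `waitMs`/`suspendMs` tune how long waiting requests are held before giving up.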
Upvotes: 2