Reputation: 3302
My question is about how asynchronous JAX-RS works, which is new to me, and I'm trying to grasp its advantage. What I understand is that the client sends a request, the request is delegated from the request thread to a worker thread, and once the processing is completed the response is sent back to the client using the AsyncResponse. What I've also understood is that throughout the process the client waits for a response from the server. (So as far as the client is concerned, it's the same as a normal synchronous request.)
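To make that concrete, here is a rough sketch of the kind of client call I mean (the URL and class name are just illustrative): a plain JAX-RS client request blocks until the response arrives, whether or not the server handles it asynchronously.

```java
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;

public class BlockingClientExample {
    public static void main(String[] args) {
        Client client = ClientBuilder.newClient();
        // Blocks here until the server sends the response, regardless of
        // whether the server resource uses AsyncResponse internally.
        String result = client.target("http://localhost:8080/api/slow")
                              .request()
                              .get(String.class);
        System.out.println(result);
        client.close();
    }
}
```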
What I've read also states that, because the request is handed off to a worker thread for further processing, the I/O threads are free to accept new connections.
What I did not understand is this: the client is still waiting for a response, and therefore an active connection is still maintained between the client and the server. Is this connection not maintained by an I/O thread? What does "suspended" mean here?
Also, even if the I/O thread is released because the processing is delegated to a worker thread, the connection with the client is still open, so how can the server accept more and more connections?
My next question is about the thread pools used here. Are the I/O threads and the worker threads from different pools? Do the worker/processor threads not come from a pool managed by the server?
Because I haven't understood this, my next thought is: having one pool for I/O and a separate pool for processing, while the client connection stays open, is effectively the same as having the I/O thread blocked while it does the processing itself, right?
I haven't grasped this concept very well.
Upvotes: 0
Views: 576
Reputation: 2374
The thread pools in use in this scenario are:

1. The request processing pool, managed by the container (this is "pool #1" below).
2. The worker pool, managed by your own code (the "worker threads").
There is possibly an IO thread associated with the connection, but that's an implementation detail that doesn't affect this.
When you use AsyncResponse, as soon as you return from your handle method, the request processing thread (from pool #1) is freed and can be used by the container to handle another request.
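As a rough sketch of that pattern (the resource path, pool size, and helper method below are illustrative, not taken from the question):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;

@Path("/slow")
public class SlowResource {

    // Pool #2: a worker pool managed by the application, not by the container.
    private static final ExecutorService workers = Executors.newFixedThreadPool(10);

    @GET
    public void handle(@Suspended final AsyncResponse asyncResponse) {
        // This method runs on a request processing thread from pool #1.
        workers.submit(() -> {
            // The long-running work happens on a pool #2 thread.
            String result = doExpensiveWork();
            // Completes the suspended request; the response is written back
            // over the still-open client connection.
            asyncResponse.resume(result);
        });
        // Returning here frees the pool #1 thread to handle other requests,
        // even though the client is still waiting on its connection.
    }

    private String doExpensiveWork() {
        return "done";
    }
}
```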
On to your questions:
1. It still makes sense to use AsyncResponse for long-running requests, because you are freeing up one of your limited resources (threads in thread pool #1). The connection itself is not freed, and connections are also limited resources, so you can run out of those still (as well as possibly being limited by CPU or memory).

2. The worker threads can come from an ExecutorService or similar, but the point is that you manage the "worker thread" yourself. The exception here is if you use Jersey's @ManagedAsync annotation.

3. The connection is not freed by AsyncResponse, but AsyncResponse does free up the container's request processing threads, which can be more limited in number than the maximum number of connections. You may choose to handle this problem by changing the server configuration instead of using AsyncResponse, but AsyncResponse has two advantages: it is under the application's control, and it is per-request instead of per-server.

Upvotes: 2