defnull

Reputation: 4199

Stream large responses with jersey, asynchronously

I want to allow clients (including very slow clients) to download large files from a JAX-RS (Jersey) web service, and I'm stuck. It seems like the async features in JAX-RS do not support this.
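For context, the straightforward blocking approach looks roughly like this (the resource path and file location are just placeholders): a StreamingOutput ties up one container thread per client for the entire download, which is exactly what falls apart with very slow clients.

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.StreamingOutput;

@Path("/files")
public class FileResource {

    @GET
    @Path("/{name}")
    @Produces(MediaType.APPLICATION_OCTET_STREAM)
    public Response download(@PathParam("name") String name) {
        // One worker thread stays blocked inside write() until this client
        // has received the whole file.
        StreamingOutput stream = output -> {
            try (InputStream in = Files.newInputStream(Paths.get("/data", name))) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    output.write(buffer, 0, read);
                }
            }
        };
        return Response.ok(stream).build();
    }
}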

Am I not seeing the obvious solution?

Upvotes: 7

Views: 5400

Answers (2)

osoitza

Reputation: 181

I had a similar problem. I needed to transfer large amounts of data between two instances of my application. Initially I used the simple StreamingOutput approach, but I soon realized it would not work: the client side was much slower than the server side, and I kept getting a TimeoutException. I was able to solve this by configuring my Grizzly server, and with that change I can transfer hundreds of megabytes with the StreamingOutput approach. My code for setting the timeout looks like this:

// 'server' is the Grizzly HttpServer the Jersey application runs on.
// Disable the write timeout on every listener so slow clients don't trip it.
Collection<NetworkListener> listeners = server.getListeners();
for (NetworkListener listener : listeners) {
    final TCPNIOTransport transport = listener.getTransport();
    transport.setKeepAlive(true);
    transport.setWriteTimeout(0, TimeUnit.MINUTES);
}
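For completeness, here is a minimal sketch of where that snippet could fit when the server is created with Jersey's GrizzlyHttpServerFactory. The base URI and the resource package are placeholders; the important part is creating the server with start = false, so the transport can be tuned before the server starts.

import java.io.IOException;
import java.net.URI;
import java.util.concurrent.TimeUnit;

import org.glassfish.grizzly.http.server.HttpServer;
import org.glassfish.grizzly.http.server.NetworkListener;
import org.glassfish.grizzly.nio.transport.TCPNIOTransport;
import org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpServerFactory;
import org.glassfish.jersey.server.ResourceConfig;

public class Main {
    public static void main(String[] args) throws IOException {
        ResourceConfig config = new ResourceConfig().packages("com.example.resources");

        // Create the server but do not start it yet, so the transport can be tuned first.
        HttpServer server = GrizzlyHttpServerFactory.createHttpServer(
                URI.create("http://0.0.0.0:8080/"), config, false);

        // Disable the write timeout on every listener before starting.
        for (NetworkListener listener : server.getListeners()) {
            TCPNIOTransport transport = listener.getTransport();
            transport.setKeepAlive(true);
            transport.setWriteTimeout(0, TimeUnit.MINUTES);
        }

        server.start();
    }
}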

Upvotes: 1

defnull

Reputation: 4199

With reasonably new versions of Jersey and Jetty, the following works:

  • Inject @Suspended AsyncResponse into your JAX-RS request handler method. This tells Jersey to enter async mode and keep the request open.
  • Inject @Context HttpServletRequest to access the servlet-level APIs.
  • Call HttpServletRequest.getAsyncContext() instead of HttpServletRequest.startAsync(): Jersey has already switched to async mode, and starting it again results in an IllegalStateException (which was my problem above).
  • Use this AsyncContext as you would in a plain servlet environment. Jersey does not complain.
  • Once you are done, call AsyncContext.complete() and then AsyncResponse.cancel(). The latter is probably optional.

I managed to serve a 10 GB file to 100 concurrent clients this way. The thread count never exceeded ~40, memory consumption stayed low, and throughput was around 3 GB/s on my laptop, which is kinda impressive.

// Needed imports (Servlet 3.1 and JAX-RS 2.x):
import java.io.IOException;
import javax.servlet.AsyncContext;
import javax.servlet.ServletOutputStream;
import javax.servlet.WriteListener;
import javax.servlet.http.HttpServletRequest;
import javax.ws.rs.GET;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;
import javax.ws.rs.core.Context;

@GET
public void doAsync(@Suspended final AsyncResponse asyncResponse,
                    @Context HttpServletRequest servletRequest)
        throws IOException {
    // Jersey has already switched the request to async mode, so just grab the context.
    assert servletRequest.isAsyncStarted();
    final AsyncContext asyncContext = servletRequest.getAsyncContext();
    final ServletOutputStream s = asyncContext.getResponse().getOutputStream();

    s.setWriteListener(new WriteListener() {

        volatile boolean done = false;

        @Override
        public void onWritePossible() throws IOException {
            // Write only while the container can accept data without blocking.
            while (s.isReady()) {
                if (done) {
                    asyncContext.complete();
                    asyncResponse.cancel();
                    break;
                } else {
                    s.write(...);  // write the actual payload here
                    done = true;
                }
            }
        }

        @Override
        public void onError(Throwable t) {
            asyncContext.complete();
        }
    });
}
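To show what the elided s.write(...) might look like in practice, here is a sketch of the same pattern streaming an actual file. The file path and buffer size are made up, and it reuses the imports from the block above plus java.io.InputStream, java.nio.file.Files and java.nio.file.Paths.

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

@GET
public void doAsyncFile(@Suspended final AsyncResponse asyncResponse,
                        @Context HttpServletRequest servletRequest)
        throws IOException {
    final AsyncContext asyncContext = servletRequest.getAsyncContext();
    final ServletOutputStream out = asyncContext.getResponse().getOutputStream();

    // Placeholder source; replace with whatever you actually serve.
    final InputStream file = Files.newInputStream(Paths.get("/tmp/bigfile.bin"));
    final byte[] buffer = new byte[64 * 1024];

    out.setWriteListener(new WriteListener() {

        @Override
        public void onWritePossible() throws IOException {
            // Write only while the container can accept data without blocking.
            // When isReady() returns false, the container calls this method again later.
            while (out.isReady()) {
                int read = file.read(buffer);
                if (read == -1) {
                    file.close();
                    asyncContext.complete();
                    asyncResponse.cancel();
                    return;
                }
                out.write(buffer, 0, read);
            }
        }

        @Override
        public void onError(Throwable t) {
            try {
                file.close();
            } catch (IOException ignored) {
            }
            asyncContext.complete();
        }
    });
}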

Upvotes: 7
