Reputation: 423
We have a VPN that we sometimes use to connect to sites remotely, and over it the chunked transfer encoding Tomcat seems to use often leaves a lot of JavaScript files truncated. In an attempt to remedy this I wanted to build a Tomcat filter for JS files that sends them in one shot.
After reading through the articles here and here I took my own shot at this and implemented it as follows.
I hooked the filter into my context with a custom filter map of
new CustomFilterMap(
        new String[]{"*.js"},
        new String[]{"REQUEST"}
)
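For context, CustomFilterMap is just a small convenience class of ours; I assume that under the hood it boils down to roughly the stock FilterDef/FilterMap registration below (an illustrative sketch only, with JsFilterRegistration and the filter name made up for the example):
import org.apache.catalina.Context;
import org.apache.tomcat.util.descriptor.web.FilterDef;
import org.apache.tomcat.util.descriptor.web.FilterMap;

public class JsFilterRegistration {
    // context is whatever programmatic Context the webapp is being configured with.
    public static void register(Context context) {
        // Register the filter class under a name the container can refer to.
        FilterDef filterDef = new FilterDef();
        filterDef.setFilterName("noChunkEncodedJSFilter");
        filterDef.setFilterClass(NoChunkEncodedJSFilter.class.getName());
        context.addFilterDef(filterDef);

        // Map that named filter to *.js for the REQUEST dispatcher only.
        FilterMap filterMap = new FilterMap();
        filterMap.setFilterName("noChunkEncodedJSFilter");
        filterMap.addURLPattern("*.js");
        filterMap.setDispatcher("REQUEST");
        context.addFilterMap(filterMap);
    }
}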
This part has actually worked well and my filter seems to be applied to JS files. Now the part I haven't gotten working so well is the actual filter.
public class NoChunkEncodedJSFilter extends OncePerRequestFilter {
    @Override
    protected void doFilterInternal(HttpServletRequest request,
                                    HttpServletResponse response,
                                    FilterChain filterChain) throws ServletException, IOException {
        val unversionedServletPath = request.getRequestURI().substring(request.getRequestURI().indexOf("/js"));
        val requestFilePath = getServletContext().getRealPath(unversionedServletPath);
        if (requestFilePath != null) {
            File requestedFile = new File(requestFilePath);
            if (requestedFile.exists()) {
                val fileSize = Math.toIntExact(requestedFile.length());
                val out = response.getWriter();
                // Buffer the downstream output so it can be written back in one shot.
                val wrappedResponse = new NonEncodedResponse(response, fileSize);
                filterChain.doFilter(request, wrappedResponse);
                out.write(wrappedResponse.toString());
                out.close();
                return;
            }
        }
        filterChain.doFilter(request, response);
    }
}
With my NonEncodedResponse:
public class NonEncodedResponse extends HttpServletResponseWrapper {
    private ByteArrayOutputStream baos;

    @Override
    public String toString() {
        try {
            return baos.toString("UTF-8");
        } catch (UnsupportedEncodingException unsupportedEncodingException) {
            return baos.toString();
        }
    }

    /**
     * Constructs a response adaptor wrapping the given response.
     *
     * @param response The response to be wrapped
     * @param fileSize The size of the file being served, used as content length and buffer size
     * @throws IllegalArgumentException if the response is null
     */
    public NonEncodedResponse(HttpServletResponse response, Integer fileSize) {
        super(response);
        // Belt and braces: neither method is overridden here, so the this. and
        // super. calls end up hitting the same wrapped response.
        this.setContentLength(fileSize);
        this.setBufferSize(fileSize);
        super.setContentLength(fileSize);
        super.setBufferSize(fileSize);
        baos = new ByteArrayOutputStream(fileSize);
    }

    @Override
    public PrintWriter getWriter() {
        return new PrintWriter(baos);
    }
}
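One thing I wasn't sure about while writing this: my wrapper only overrides getWriter(), so anything downstream that asks for getOutputStream() would still write straight through to the real response. I imagine a more complete wrapper would have to capture both, something along these lines (BufferedResponseWrapper and getCapturedBytes are just illustrative names for a sketch I haven't actually run, assuming Servlet 3.1+):
import java.io.ByteArrayOutputStream;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import javax.servlet.ServletOutputStream;
import javax.servlet.WriteListener;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

// Hypothetical fuller wrapper: captures both the writer and the byte stream
// so nothing downstream can write to the real response behind its back.
public class BufferedResponseWrapper extends HttpServletResponseWrapper {
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    private final PrintWriter writer =
            new PrintWriter(new OutputStreamWriter(buffer, StandardCharsets.UTF_8), true);

    public BufferedResponseWrapper(HttpServletResponse response) {
        super(response);
    }

    @Override
    public PrintWriter getWriter() {
        return writer;
    }

    @Override
    public ServletOutputStream getOutputStream() {
        return new ServletOutputStream() {
            @Override
            public void write(int b) {
                buffer.write(b);
            }

            @Override
            public boolean isReady() {
                return true;
            }

            @Override
            public void setWriteListener(WriteListener listener) {
                // Not needed for a simple in-memory buffer.
            }
        };
    }

    // The captured bytes, ready to be written out in one shot with a fixed length.
    public byte[] getCapturedBytes() {
        writer.flush();
        return buffer.toByteArray();
    }
}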
Now, originally I tried not wrapping the response at all and just calling response.setBufferSize(fileSize);
and response.setContentLength(fileSize);
but this seemed to have no actual effect on my output, and when I looked at the headers the response was still using chunked transfer encoding without a fixed Content-Length. (I also tried setting a custom header like this and didn't see it appended to my response either. I'm assuming the response that goes into the filter in its base form is some kind of read-only.)
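One possibility I haven't ruled out is that the response is already committed by the time my filter gets to touch it, since header and content-length changes after the commit point are silently ignored. A sanity check would look something like this (purely an illustrative snippet, not something in my filter right now; logger is the Commons Logging instance inherited from GenericFilterBean):
// Illustrative check: once a response is committed, setHeader/setContentLength
// calls no longer change what actually goes over the wire.
if (response.isCommitted()) {
    logger.warn("Response already committed before the JS filter could adjust headers for "
            + request.getRequestURI());
}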
I also tried using my wrapper and bypassing its output stream by reading and sending the bytes straight from the file
val fileContents = Files.readAllBytes(Paths.get(requestFilePath));
out.write(new String(fileContents));
because it seemed like, even with my attempts to fix the Content-Length failing, my browser's network tab was still showing only part of each file being delivered as the whole response while I was debugging.
All in all I now have it serving files like this (probably not ideal), and even less ideally the response still says Transfer-Encoding: chunked
despite all my efforts to put a fixed Content-Length on both my wrapper and the original response. I can now put a custom header on my content, so I know the request is running through my filter; it just seems to stay chunk-encoded for some reason.
Can someone please tell me what I'm doing wrong, or whether filters can even be used to disable chunked encoding in the first place? (I believe I saw suggestions from Google searches that they can, but the truth is I just don't know.) I would happily welcome any and all advice on this issue; I'm out of ideas and I certainly don't know what I'm doing.
Upvotes: 1
Views: 2412
Reputation: 423
So I actually managed to get this working by reverting to my original approach: ditching my wrapper entirely and moving the filterChain.doFilter
call to the end of my code block. I'm not really sure why this works compared to what I was doing, because I have to confess I don't really know what filterChain.doFilter
actually does at all. This is the final result that got things working.
public class NoChunkEncodedJSFilter extends OncePerRequestFilter {
    @Override
    protected void doFilterInternal(HttpServletRequest request,
                                    HttpServletResponse response,
                                    FilterChain filterChain) throws ServletException, IOException {
        val requestFilePath = this.getRealFilePath(request.getRequestURI());
        File requestedFile = new File(requestFilePath);
        if (requestedFile.exists()) {
            val fileSize = Math.toIntExact(requestedFile.length());
            val fileContents = Files.readAllBytes(Paths.get(requestFilePath));
            // Clear anything already buffered, then write the whole file with a fixed Content-Length.
            response.reset();
            response.setHeader("Content-Length", String.valueOf(fileSize));
            response.setContentLength(fileSize);
            ServletOutputStream sos = response.getOutputStream();
            sos.write(fileContents);
            sos.close();
        }
        filterChain.doFilter(request, response);
    }

    private String getRealFilePath(String requestURI) {
        val unversionedServletPath = requestURI.substring(requestURI.indexOf("/js"));
        return getServletContext().getRealPath(unversionedServletPath);
    }
}
Now my only remaining question is: is there a smarter way to get the file's contents, or do I have to re-read the file from disk every time? I'm assuming that somewhere in there the file has probably already been loaded into memory, ready to be served, and here I am loading it into memory a second time, which I imagine isn't very performant.
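Something like the sketch below is roughly what I have in mind: a small cache keyed by the real file path that only re-reads from disk when the file's last-modified time changes (JsFileCache and CachedJsFile are hypothetical names, just to illustrate the question):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical cache: keeps file bytes in memory and only re-reads
// from disk when the file's last-modified timestamp changes.
public class JsFileCache {
    private static class CachedJsFile {
        final byte[] contents;
        final long lastModified;

        CachedJsFile(byte[] contents, long lastModified) {
            this.contents = contents;
            this.lastModified = lastModified;
        }
    }

    private final Map<String, CachedJsFile> jsCache = new ConcurrentHashMap<>();

    public byte[] getContents(String realFilePath) throws IOException {
        Path path = Paths.get(realFilePath);
        long lastModified = Files.getLastModifiedTime(path).toMillis();

        CachedJsFile cached = jsCache.get(realFilePath);
        if (cached == null || cached.lastModified != lastModified) {
            cached = new CachedJsFile(Files.readAllBytes(path), lastModified);
            jsCache.put(realFilePath, cached);
        }
        return cached.contents;
    }
}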
Upvotes: 1