Reputation: 741
I'm using SparkJava 2.2, which uses Jetty 9.0.2.
I'm getting a "Form too large" exception thrown by Jetty. I already know how to solve this problem if I were using Jetty directly:
http://www.eclipse.org/jetty/documentation/current/setting-form-size.html
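For reference, if I were creating the Jetty Server myself, the fix from that page would look roughly like this (a minimal sketch; the port, class name and the ServletContextHandler are only illustrative, not my real setup):

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.ServletContextHandler;

public class DirectJettyExample {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        // Per-context limit, as described on the Jetty page above
        ServletContextHandler context = new ServletContextHandler(server, "/");
        context.setMaxFormContentSize(1024 * 1024);

        // Server-wide attribute that the Request falls back to when no context limit applies
        server.setAttribute("org.eclipse.jetty.server.Request.maxFormContentSize", 1024 * 1024);

        server.start();
        server.join();
    }
}

But with SparkJava I never touch the Server instance.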
PROBLEM:
Now I need to find a way to change the org.eclipse.jetty.server.Request.maxFormContentSize setting through SparkJava. Is there a way to do this?
I must note that other methods (JVM_OPTS, System.setProperty) do not work for me for some reason. I'm still getting the same exception.
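For completeness, this is roughly what I mean (the 500000 value and the jar name are just examples): passing the Jetty system property on the command line,

java -Dorg.eclipse.jetty.server.Request.maxFormContentSize=500000 -jar myapp.jar

and setting it programmatically before defining any Spark routes:

System.setProperty("org.eclipse.jetty.server.Request.maxFormContentSize", "500000");

Neither changes anything; I still get the exception below.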
Stacktrace:
[qtp1858644635-27] ERROR spark.webserver.MatcherFilter -
java.lang.IllegalStateException: Form too large 308913>200000
at org.eclipse.jetty.server.Request.extractParameters(Request.java:334)
at org.eclipse.jetty.server.Request.getParameterMap(Request.java:765)
at javax.servlet.ServletRequestWrapper.getParameterMap(ServletRequestWrapper.java:193)
at spark.QueryParamsMap.<init>(QueryParamsMap.java:59)
at spark.Request.initQueryMap(Request.java:364)
at spark.Request.queryMap(Request.java:349)
at spark.webserver.RequestWrapper.queryMap(RequestWrapper.java:213)
at com.xyz.analytics.webservice.RequestTools.getRequestQueryMap(RequestTools.java:27)
at com.xyz.analytics.webservice.RequestTools.getMandrillQueryParams(RequestTools.java:22)
at com.xyz.analytics.webservice.Endpoints.lambda$initiateEndpointsAndExceptionHandlers$2(Endpoints.java:61)
at com.xyz.analytics.webservice.Endpoints$$Lambda$3/1485697819.handle(Unknown Source)
at spark.SparkBase$1.handle(SparkBase.java:311)
at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:159)
at spark.webserver.JettyHandler.doHandle(JettyHandler.java:60)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:179)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:136)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:451)
at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:252)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:266)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.run(AbstractConnection.java:240)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:596)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:527)
at java.lang.Thread.run(Thread.java:745)
The debugger doesn't even stop at any breakpoint set within org.eclipse.jetty.server.handler.ContextHandler. Plus, when it stops at breakpoints in org.eclipse.jetty.server.Request, the _context property is null. It seems SparkJava is handling this differently. Dead end.
Request does one more thing before falling back to maxFormContentSize = 200000: it checks _channel.getServer().getAttribute("org.eclipse.jetty.server.Request.maxFormContentSize").
Except the Server's attribute collection is empty, and I don't see any way to add an attribute. The Jetty Server is created by SparkBase.init(), which calls SparkServer.ignite(), but that doesn't help much: there is no easy way to "break in" and make our own adjustments. It seems pretty hopeless.
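For reference, the lookup I'm describing boils down to roughly this (my paraphrase, not the actual Jetty 9.0.2 source):

// Inside Request.extractParameters(), paraphrased:
if (_context != null) {
    maxFormContentSize = _context.getContextHandler().getMaxFormContentSize();
} else {
    // Our case: no context, so only the Server attribute is consulted
    Object attr = _channel.getServer()
            .getAttribute("org.eclipse.jetty.server.Request.maxFormContentSize");
    maxFormContentSize = (attr == null) ? 200000 : ((Number) attr).intValue();
}

So that Server attribute is the only remaining hook, and I have no handle on the Server to set it.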
Upvotes: 6
Views: 5082
Reputation: 741
Good news everyone :) In Spark 2.6 (released April 2017) the embedded Jetty server is fully configurable! Release notes: http://sparkjava.com/news#spark-26-released
See the original feature request for more details here: https://github.com/perwendel/spark/issues/314 and the related pull request here: https://github.com/perwendel/spark/pull/813
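For the form-size problem specifically, something along these lines should work (a sketch against the API added in that pull request; I'm assuming the spark.embeddedserver.EmbeddedServers / EmbeddedJettyFactory / JettyServerFactory names and method signatures from it, and the 1 MB limit, thread-pool defaults and /upload route are arbitrary). The factory has to be registered before the first route is mapped:

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.util.thread.QueuedThreadPool;
import org.eclipse.jetty.util.thread.ThreadPool;
import spark.Spark;
import spark.embeddedserver.EmbeddedServers;
import spark.embeddedserver.jetty.EmbeddedJettyFactory;
import spark.embeddedserver.jetty.JettyServerFactory;

public class Main {
    public static void main(String[] args) {
        // Register a custom Jetty factory before any route is declared
        EmbeddedServers.add(EmbeddedServers.Identifiers.JETTY,
                new EmbeddedJettyFactory(new JettyServerFactory() {
                    @Override
                    public Server create(int maxThreads, int minThreads, int threadTimeoutMillis) {
                        Server server = (maxThreads > 0)
                                ? new Server(new QueuedThreadPool(maxThreads,
                                        Math.max(minThreads, 8),
                                        (threadTimeoutMillis > 0) ? threadTimeoutMillis : 60000))
                                : new Server();
                        // The attribute Jetty's Request falls back to when there is no context limit
                        server.setAttribute("org.eclipse.jetty.server.Request.maxFormContentSize", 1024 * 1024);
                        return server;
                    }

                    @Override
                    public Server create(ThreadPool threadPool) {
                        Server server = (threadPool != null) ? new Server(threadPool) : new Server();
                        server.setAttribute("org.eclipse.jetty.server.Request.maxFormContentSize", 1024 * 1024);
                        return server;
                    }
                }));

        Spark.post("/upload", (req, res) -> "ok");
    }
}

The point is that you now construct the org.eclipse.jetty.server.Server yourself, so you can set the org.eclipse.jetty.server.Request.maxFormContentSize attribute before Spark starts it.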
NOTE that it is also possible to run Spark on another web server instead of the embedded Jetty server: http://sparkjava.com/documentation#other-web-server
Upvotes: 3
Reputation: 212
Well, if you have access to the Server object, you can always do something like:
server.setAttribute("org.eclipse.jetty.server.Request.maxFormContentSize", 1024 * 1024);
Hope this helps!
Upvotes: 0
Reputation: 49515
Not possible with Spark 2.2
The creation of the ServerConnector is hardcoded in SparkServer; you cannot change those values after the fact, as they have to be passed into the ServerConnector before the server starts.
Would recommend filing a bug with Spark to make that configurable.
https://github.com/perwendel/spark/issues
Upvotes: 4