Joel Holmes

Reputation: 1053

Elasticsearch Large Bulk Upload Query

I'm trying to upload a JSON file using the Elasticsearch API, but I keep getting this error:

 Caught exception while handling client http traffic, closing connection [id: 0x0d08b235, /172.17.0.1:33780 => /172.17.0.2:9200]
org.jboss.netty.handler.codec.frame.TooLongFrameException: HTTP content length exceeded 104857600 bytes.
    at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:169)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)

Is there a way to chunk this easily?

Upvotes: 0

Views: 295

Answers (1)

Andrei Stefan

Reputation: 52368

You'd need to increase the http.max_content_length value to something greater than the default (100MB): https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-http.html
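For example, a sketch of the setting in `elasticsearch.yml` (the setting name comes from the docs linked above; the `200mb` value is just an illustrative choice, not a recommendation):

```yaml
# elasticsearch.yml -- raise the HTTP request size limit from the 100mb default.
# 200mb here is an arbitrary example value; size it to your actual bulk payloads.
http.max_content_length: 200mb
```

The node must be restarted for this static setting to take effect.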

BUT you need to be careful with this value and not increase it too much. Bulk operations that arrive at a node are held temporarily in a queue (if too many arrive at the same time) in a memory buffer, before being split and sent to the appropriate nodes for further processing. So if you have many concurrent bulk operations and they are large enough, they can use a lot of memory.
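As an alternative to raising the limit, you can split the bulk body on the client side so each request stays well under the server's cap. A minimal sketch in Python, assuming the file is standard bulk NDJSON where each action line is followed by a document line (so lines are kept in pairs; delete actions, which have no document line, are not handled here):

```python
def chunk_bulk_body(lines, max_bytes=10 * 1024 * 1024):
    """Split NDJSON bulk lines into request bodies of at most ~max_bytes.

    `lines` is a list of strings without trailing newlines, alternating
    action metadata and document source. Pairs are never split across
    chunks. max_bytes is a hypothetical client-side limit, chosen well
    below the server's http.max_content_length.
    """
    chunk, size = [], 0
    for i in range(0, len(lines), 2):
        pair = lines[i:i + 2]
        # +1 per line accounts for the newline each line needs in the body
        pair_size = sum(len(line.encode("utf-8")) + 1 for line in pair)
        if chunk and size + pair_size > max_bytes:
            yield "\n".join(chunk) + "\n"  # bulk bodies must end with a newline
            chunk, size = [], 0
        chunk.extend(pair)
        size += pair_size
    if chunk:
        yield "\n".join(chunk) + "\n"
```

Each yielded body can then be POSTed to the `_bulk` endpoint in its own request (e.g. with `requests.post("http://localhost:9200/_bulk", data=body, headers={"Content-Type": "application/x-ndjson"})` — host and port here are assumptions for illustration).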

Upvotes: 1

Related Questions