nitin tyagi

Reputation: 1186

ElasticSearch bulk insertion gives error using RestHighLevelClient

I am getting an exception while doing ElasticSearch bulk insertion with RestHighLevelClient.

ElasticsearchStatusException[Unable to parse response body]; nested: ResponseException[method [POST], host [http:x.com], URI [/_bulk?timeout=1m], status line [HTTP/1.1 413 Request Entity Too Large] {"Message":"Request size exceeded 104857600 bytes"}];
    at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:1386)
    at org.elasticsearch.client.RestHighLevelClient$1.onFailure(RestHighLevelClient.java:1357)
    at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onDefinitiveFailure(RestClient.java:844)
    at org.elasticsearch.client.RestClient$1.completed(RestClient.java:548)
    at org.elasticsearch.client.RestClient$1.completed(RestClient.java:529)
    at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:122)
    at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:181)
    at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:448)
    at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:338)
    at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265)
    at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81)
    at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39)
    at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114)
    at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
    at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
    at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
    at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
    at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
    at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:591)
    at java.lang.Thread.run(Unknown Source)
    Suppressed: ParsingException[Failed to parse object: expecting field with name [error] but found [Message]]
        at org.elasticsearch.common.xcontent.XContentParserUtils.ensureFieldName(XContentParserUtils.java:50)
        at org.elasticsearch.ElasticsearchException.failureFromXContent(ElasticsearchException.java:605)
        at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:169)
        at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:1406)
        at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:1382)
        ... 19 more

I have searched SO posts and found that some people suggest setting http.max_content_length, but I was not able to resolve the issue with that. Below is my code for saving the data:

import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RequestOptions.Builder;
import org.elasticsearch.client.RestHighLevelClient;

private void saveAll(BulkRequest bulkRequest, String indexName)
{
    try {
        System.out.println("***Saving data into " + indexName + " of Size = " + bulkRequest.numberOfActions());
        // restHighLevelClient.bulk(bulkRequest, RequestOptions.DEFAULT);

        // Attempted workaround: pass http.max_content_length as a request header
        Builder builder = RequestOptions.DEFAULT.toBuilder();
        builder.addHeader("http.max_content_length", "500mb");

        RequestOptions requestOptions = builder.build();

        restHighLevelClient.bulkAsync(bulkRequest, requestOptions, new ActionListener<BulkResponse>()
        {
            @Override
            public void onResponse(BulkResponse response)
            {
                System.out.println("Bulk data successfully saved in " + indexName + " ElasticIndex***");
            }

            @Override
            public void onFailure(Exception e)
            {
                System.out.println("Issue in bulk data saving in " + indexName + " ElasticIndex***");
                e.printStackTrace();
            }
        });
    } catch (Exception e) {
        e.printStackTrace();
    }
}
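
From what I understand, the 104857600 bytes in the 413 response is 100 MB, which matches Elasticsearch's default http.max_content_length; that is a server-side limit, so sending it as a client request header (as I do above) should have no effect. As a rough diagnostic sketch, I could at least log how large each bulk request is before sending it (logBulkSize is a made-up helper name, not part of the client API):

// Rough sketch: log the estimated payload size so it can be compared with the
// 104857600-byte (100 MB) limit reported in the 413 response before sending.
private void logBulkSize(BulkRequest bulkRequest)
{
    long estimatedBytes = bulkRequest.estimatedSizeInBytes();
    System.out.println("Bulk payload ~" + estimatedBytes + " bytes ("
            + (estimatedBytes / (1024 * 1024)) + " MB) for "
            + bulkRequest.numberOfActions() + " actions");
}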

Upvotes: 0

Views: 5128

Answers (1)

nitin tyagi

Reputation: 1186

I have resolved the issue by reducing the size of the BulkRequest. As you can see in the code below, if the dataModelList size is greater than 10000, I split the list into batches of 5000 documents and save each batch into Elastic. Hope it helps someone.

// Lists.partition comes from Guava (com.google.common.collect.Lists)
if (dataModelList.size() > 10000) {
    List<List<BaseDataModel>> baseDataModels = Lists.partition(dataModelList, 5000);
    baseDataModels.forEach(baseModels -> {
        BulkRequest bulkRequest = new BulkRequest();
        baseModels.forEach(baseModel -> bulkRequest.add(getBaseDataModelIndexRequest(baseModel)));
        saveAll(bulkRequest, rawEventIndex);
    });
}
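
Alternatively, instead of partitioning by document count, the high-level client's BulkProcessor can flush by action count and payload size automatically. This is only a rough sketch, not the code I used above; it reuses restHighLevelClient and getBaseDataModelIndexRequest from the question, and the 5000-action / 50 MB thresholds are just example values:

// Sketch: BulkProcessor (org.elasticsearch.action.bulk.BulkProcessor) flushes a bulk
// request automatically once either of the thresholds below is reached.
BulkProcessor.Listener listener = new BulkProcessor.Listener() {
    @Override
    public void beforeBulk(long executionId, BulkRequest request) {
        System.out.println("Sending " + request.numberOfActions() + " actions");
    }

    @Override
    public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
        System.out.println("Saved " + response.getItems().length + " documents");
    }

    @Override
    public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
        failure.printStackTrace();
    }
};

BulkProcessor bulkProcessor = BulkProcessor.builder(
        (request, bulkListener) ->
                restHighLevelClient.bulkAsync(request, RequestOptions.DEFAULT, bulkListener),
        listener)
        .setBulkActions(5000)                                // flush every 5000 documents
        .setBulkSize(new ByteSizeValue(50, ByteSizeUnit.MB)) // or every 50 MB, whichever comes first
        .build();

dataModelList.forEach(baseModel -> bulkProcessor.add(getBaseDataModelIndexRequest(baseModel)));
bulkProcessor.close(); // flushes whatever is left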

Upvotes: 0
