M a m a D

Reputation: 2139

Elasticsearch: illegal_argument_exception - Failed to bulk insert with elastic dump

There are 23,000 records fetched from a database and formatted into JSON. I need to insert each one as a document into the autocomplete_entities index in Elasticsearch. The input file looks like this:

[
    {
        "id" : 1,
        "title" : "x"
    },
    {
        "id" : 2,
        "title" : "y"
    },
    ...
]

This is the bash command I am working with:

elasticdump --input=PycharmProjects/untitled/vendors.json --output=http://localhost:9200/autocomplete_entities/_doc --type=data --transform="doc._source=Object.assign({},doc)" --limit=1000

I got this command from this link. After running it, I receive this error:

Thu, 01 Oct 2020 10:57:34 GMT | starting dump
Thu, 01 Oct 2020 10:57:34 GMT | Will modify documents using these scripts: doc._source=Object.assign({},doc)
Thu, 01 Oct 2020 10:57:35 GMT | got 1 objects from source file (offset: 0)
{ _index: 'autocomplete_entities',
_type: '_doc',
_id: 'bpnP43QB8j0CMKYkavN7',
status: 400,
error:
{ type: 'illegal_argument_exception',
    reason:
    'Limit of total fields [1000] in index [autocomplete_entities] has been exceeded' } }
Thu, 01 Oct 2020 10:57:41 GMT | sent 1 objects to destination elasticsearch, wrote 0
Thu, 01 Oct 2020 10:57:41 GMT | got 0 objects from source file (offset: 1)
Thu, 01 Oct 2020 10:57:41 GMT | Total Writes: 0
Thu, 01 Oct 2020 10:57:41 GMT | dump complete

Upvotes: 0

Views: 697

Answers (1)

Amit

Reputation: 32386

In your index, you crossed the default limit of 1000 fields, which caused the exception; please refer to the ES docs on mapping settings for more info.

You can also raise this default limit to fix the issue, although if that many fields are not intended, please try to fix the mapping instead, as more than 1000 fields in an index can cause performance and several other issues.

Request to increase the limit:

PUT http://localhost:9200/autocomplete_entities/_settings

{
    "index.mapping.total_fields.limit" : 2000
}
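Assuming Elasticsearch is running locally on port 9200 as in the question, the settings update above can be applied from the shell with curl (the index name and the limit of 2000 are taken from this thread; adjust them to your setup):

```shell
# Raise the total-fields limit on the autocomplete_entities index.
# 2000 is an example value; pick one large enough for your actual mapping.
curl -X PUT "http://localhost:9200/autocomplete_entities/_settings" \
  -H 'Content-Type: application/json' \
  -d '{
    "index.mapping.total_fields.limit": 2000
  }'
```

You can confirm the new value with `GET http://localhost:9200/autocomplete_entities/_settings` before re-running elasticdump.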

Upvotes: 1
