Deano

Reputation: 12190

elasticsearch bulk insert JSON file

I have the following JSON file

I have used awk to strip the whitespace, trailing spaces, and newlines:

awk -v ORS= -v OFS= '{$1=$1}1' data.json
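For illustration, here is that awk one-liner on a hypothetical pretty-printed file (the file name and contents are made up): an empty `ORS` drops the newline after each record, and `$1=$1` with an empty `OFS` rebuilds each record with no separator between fields, so the whole file collapses onto one line.

```shell
# Hypothetical input mimicking a pretty-printed data.json
printf '{\n  "name": "widget",\n  "qty": 2\n}\n' > sample.json

# Empty ORS removes the newlines; $1=$1 with empty OFS removes the
# whitespace between fields. Note this also joins whitespace-separated
# fields inside quoted strings, so values containing spaces get mangled.
awk -v ORS= -v OFS= '{$1=$1}1' sample.json
# → {"name":"widget","qty":2}
```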

I have added a create action as the first line of my data.json, followed by \n and then the rest of my data:

{"create": {"_index":"socteam", "_type":"products"}} 

When I issue the bulk request, I get the following error:

curl -XPUT http://localhost:9200/_bulk

{
  "took": 1,
  "errors": true,
  "items": [
    {
      "create": {
        "_index": "socteam",
        "_type": "products",
        "_id": "AVQuGPff-1Y7OIPIJaLX",
        "status": 400,
        "error": {
          "type": "mapper_parsing_exception",
          "reason": "failed to parse",
          "caused_by": {
            "type": "not_x_content_exception",
            "reason": "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"
          }
        }
      }
    }
  ]
}

Any idea what this error means? I haven't created any mapping; I'm using vanilla Elasticsearch.

Upvotes: 1

Views: 3471

Answers (1)

cn0047

Reputation: 17051

According to this doc, you have to specify the index and type in the URL:

curl -XPUT 'localhost:9200/socteam/products/_bulk?pretty' --data-binary "@data.json"

It works for both the PUT and POST methods.
And your data.json file should have a structure like:

{"index":{"_id":"1"}}
{"name": "John Doe" }
{"index":{"_id":"2"}}
{"name": "Jane Doe" }
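As a quick sanity check on that layout (using the file name from the question): the bulk body is newline-delimited JSON, each action line paired with a source line, and the file has to end with a trailing newline or Elasticsearch rejects the request.

```shell
# Build the NDJSON bulk body shown above
cat > data.json <<'EOF'
{"index":{"_id":"1"}}
{"name": "John Doe" }
{"index":{"_id":"2"}}
{"name": "Jane Doe" }
EOF

# Two action lines + two source lines, each newline-terminated
wc -l < data.json
# → 4
```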

Maybe there's another method to import data, but this is the one I know... Hope it helps...

Upvotes: 1
