ArthurTheLearner

Reputation: 315

How do I fix the whitespace analyzer error?

I've tried importing a mapping using elasticdump with the following command:

elasticdump --input=myMapping.json --output=http://localhost:9200/myIndex --type=mapping

Getting the following output:

Fri, 18 Dec 2015 17:53:05 GMT | starting dump
Fri, 18 Dec 2015 17:53:05 GMT | got 1 objects from source file (offset: 0)
Fri, 18 Dec 2015 17:53:05 GMT | Error Emitted => {"root_cause":[{"type":"mapper_parsing_exception","reason":"analyzer [whitespace_analyzer] not found for field [_all]"}],"type":"mapper_parsing_exception","reason":"analyzer [whitespace_analyzer] not found for field [_all]"}
Fri, 18 Dec 2015 17:53:05 GMT | Total Writes: 0
Fri, 18 Dec 2015 17:53:05 GMT | dump ended with error (set phase)  => [object Object]

I've googled this and I can't find this question answered. Any ideas?

EDIT: myMapping.json

[
"{\"myIndex\":{\"mappings\":{\"favourites\":{\"_all\":{\"index_analyzer\":\"nGram_analyzer\",\"search_analyzer\":\"whitespace_analyzer\"},\"_timestamp\":{\"enabled\":true,\"store\":true},\"properties\":{\"thing\":{\"properties\":{\"type_one_id\":{\"type\":\"long\",\"include_in_all\":false},\"type_two_id\":{\"type\":\"string\"},\"type_three_id\":{\"type\":\"string\"},\"att_one\":{\"type\":\"long\"},\"att_two\":{\"type\":\"string\"},\"att_three\":{\"type\":\"long\"},\"att_four\":{\"type\":\"string\"},\"att_five\":{\"type\":\"long\"},\"att_six\":{\"type\":\"string\"},\"att_seven\":{\"type\":\"long\",\"include_in_all\":false},\"att_eight\":{\"type\":\"string\"},\"att_nine\":{\"type\":\"long\",\"include_in_all\":false},\"att_ten\":{\"type\":\"long\",\"include_in_all\":false},\"att_eleven\":{\"type\":\"long\",\"include_in_all\":false},\"att_twelve\":{\"type\":\"string\"},\"att_thirteen\":{\"type\":\"string\"},\"att_fourteen\":{\"type\":\"string\"},\"att_fifteen\":{\"type\":\"string\"},\"att_sixteen\":{\"type\":\"long\",\"include_in_all\":false},\"seventeen\":{\"type\":\"string\"},\"eighteeen\":{\"type\":\"long\",\"include_in_all\":false},\"nineteen\":{\"type\":\"long\",\"include_in_all\":false},\"twenty\":{\"type\":\"long\",\"include_in_all\":false},\"twenty_one\":{\"type\":\"long\"}}},\"uuid\":{\"type\":\"string\",\"index\":\"not_analyzed\",\"include_in_all\":false},\"versionId\":{\"type\":\"long\"},\"version_id\":{\"type\":\"long\",\"include_in_all\":false}}}}}}"
]
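
For readability, the part of that escaped mapping string which references the analyzers in question is the _all block. Decoded, it looks roughly like this (excerpt only, the rest of the properties omitted):

{
  "myIndex": {
    "mappings": {
      "favourites": {
        "_all": {
          "index_analyzer": "nGram_analyzer",
          "search_analyzer": "whitespace_analyzer"
        }
      }
    }
  }
}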

Upvotes: 1

Views: 854

Answers (1)

ChintanShah25

Reputation: 12672

You first need to define your nGram_analyzer and whitespace_analyzer. You are getting the error because ES cannot find them in the target index, so create your index like this first (change it according to your requirements):

POST myIndex
{
  "settings": {
    "analysis": {
      "analyzer": {
        "nGram_analyzer": {
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "asciifolding",
            "ngram_filter"
          ]
        },
        "whitespace_analyzer": {
          "tokenizer": "whitespace"
        }
      },
      "filter": {
        "ngram_filter": {
          "type": "nGram",
          "min_gram": 2,
          "max_gram": 8
        }
      }
    }
  }
}
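
If you want to double-check that the analyzers exist before re-running the import, you can read the index settings back (assuming Elasticsearch is on the default localhost:9200 endpoint, as in your command):

curl -XGET 'http://localhost:9200/myIndex/_settings?pretty'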

Then your command will work. This is the output I got:

Sat, 19 Dec 2015 02:55:24 GMT | starting dump
Sat, 19 Dec 2015 02:55:24 GMT | got 1 objects from source file (offset: 0)
Sat, 19 Dec 2015 02:55:24 GMT | sent 1 objects to destination elasticsearch, wrote 1
Sat, 19 Dec 2015 02:55:24 GMT | got 0 objects from source file (offset: 1)
Sat, 19 Dec 2015 02:55:24 GMT | sent 0 objects to destination elasticsearch, wrote 0
Sat, 19 Dec 2015 02:55:24 GMT | Total Writes: 1
Sat, 19 Dec 2015 02:55:24 GMT | dump complete
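
You can then verify that the mapping was actually imported by reading it back (again assuming the default localhost endpoint):

curl -XGET 'http://localhost:9200/myIndex/_mapping?pretty'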

Upvotes: 2
