Karthikeyan

Reputation: 2011

Elastic Search error : Custom Analyzer [custom_analyzer] failed to find tokenizer under name [my_tokenizer]

I'm trying to create a field mapping that uses my custom_analyzer and tokenizer, but I'm getting an error.

This is the error Kibana returns when I create the mapping:

Custom Analyzer [custom_analyzer] failed to find tokenizer under name [my_tokenizer]

Here is my mapping:

PUT attach_local
    {
        "settings": {
        "analysis": {
          "analyzer": {
            "custom_analyzer": {
              "type": "custom",
              "tokenizer": "my_tokenizer",
              "char_filter": [
                "html_strip"
              ],
              "filter": [
                "lowercase",
                "asciifolding"
              ]
            }
           }
          }
        },
        "tokenizer": {
        "my_tokenizer": {
          "type": "ngram",
          "min_gram": 3,    
          "max_gram": 3,
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      },

      "mappings" : {
        "doc" : {
          "properties" : {
            "attachment" : {
              "properties" : {
                "content" : {
                  "type" : "text",
                  "analyzer": "custom_analyzer"
                },
                "content_length" : {
                  "type" : "long"
                },
                "content_type" : {
                  "type" : "text"
                },
                "language" : {
                  "type" : "text"
                }
              }
            },
            "resume" : {
              "type" : "text"
            }
          }
        }
      }
    }

Upvotes: 0

Views: 6502

Answers (1)

Val

Reputation: 217554

It is very important to properly indent your JSON. If you do, you'll see that your `tokenizer` section is not located inside the `analysis` section — it sits at the top level of the request body, next to `settings` and `mappings`. Here is the correct definition:

{
  "settings": {
    "analysis": {
      "analyzer": {
        "custom_analyzer": {
          "type": "custom",
          "tokenizer": "my_tokenizer",
          "char_filter": [
            "html_strip"
          ],
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        }
      },
      "tokenizer": {
        "my_tokenizer": {
          "type": "ngram",
          "min_gram": 3,
          "max_gram": 3,
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "attachment": {
          "properties": {
            "content": {
              "type": "text",
              "analyzer": "custom_analyzer"
            },
            "content_length": {
              "type": "long"
            },
            "content_type": {
              "type": "text"
            },
            "language": {
              "type": "text"
            }
          }
        },
        "resume": {
          "type": "text"
        }
      }
    }
  }
}
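
Once the index is created with these settings, you can check that the analyzer is wired up correctly with the `_analyze` API. This is just an illustrative request — the sample text is arbitrary:

POST attach_local/_analyze
{
  "analyzer": "custom_analyzer",
  "text": "<p>Hello 2018</p>"
}

The `html_strip` char filter removes the `<p>` tags before tokenization, the `ngram` tokenizer emits 3-character tokens from each run of letters and digits (e.g. `hel`, `ell`, `llo`, `201`, `018` after lowercasing), and terms shorter than `min_gram` produce no tokens.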

Upvotes: 2
