ceth

Reputation: 45285

Set normalizer for field

Here is my query:

{
  "settings": {
    "analysis": {
      "analyzer": {
        "folding": {
          "tokenizer": "whitespace",
          "filter": [ "lowercase", "asciifolding" ]
        }
      },
      "normalizer": {
        "lowerasciinormalizer": {
          "type": "custom",
          "filter": [ "lowercase", "asciifolding" ]
        }
      }
    }
  },
  "mappings": {
    "entity": {
      "properties": {
        "Description": {
          "type": "text",
          "analyzer": "whitespace",
          "normalizer": "lowerasciinormalizer"
        },
        "Name": {
          "type": "text",
          "analyzer": "whitespace",
          "normalizer": "lowerasciinormalizer"
        }
      }
    }
  }
}

entity is a type in the index. I get this error message:

{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "Mapping definition for [Description] has unsupported parameters:  [normalizer : lowerasciinormalizer]"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "Failed to parse mapping [entity]: Mapping definition for [Description] has unsupported parameters:  [normalizer : lowerasciinormalizer]",
    "caused_by": {
      "type": "mapper_parsing_exception",
      "reason": "Mapping definition for [Description] has unsupported parameters:  [normalizer : lowerasciinormalizer]"
    }
  },
  "status": 400
}

How can I fix it?

Upvotes: 0

Views: 1741

Answers (1)

Nishant

Reputation: 7854

A normalizer guarantees that it produces a single token, so it can only be applied to keyword type fields; it can't be applied to a text type field. What you can do is use multi-fields: add a sub-field of type keyword under each property and apply the normalizer to that sub-field.

This is how you can modify the mapping:

{
  "settings": {
    "analysis": {
      "analyzer": {
        "folding": {
          "tokenizer": "whitespace",
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        }
      },
      "normalizer": {
        "lowerasciinormalizer": {
          "type": "custom",
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        }
      }
    }
  },
  "mappings": {
    "entity": {
      "properties": {
        "Description": {
          "type": "text",
          "analyzer": "whitespace",
          "fields": {
            "keyword": {
              "type": "keyword",
              "normalizer": "lowerasciinormalizer"
            }
          }
        },
        "Name": {
          "type": "text",
          "analyzer": "whitespace",
          "fields": {
            "keyword": {
              "type": "keyword",
              "normalizer": "lowerasciinormalizer"
            }
          }
        }
      }
    }
  }
}
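With this mapping, full-text queries can keep using the analyzed Name and Description fields, while exact (normalized) matches target the keyword sub-fields. A sketch of such a search request, assuming an index named my_index (the index name is not given in the question); a value like "Érnest" would be lowercased and ASCII-folded to "ernest" by the normalizer before matching:

```json
GET my_index/_search
{
  "query": {
    "term": {
      "Name.keyword": "Érnest"
    }
  }
}
```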

Upvotes: 2
