nick

Reputation: 57

Elasticsearch index tokenizer keyword not working

I have indices with fields like:

"Id":{"type":"string","analyzer":"string_lowercase"} // guid for example

In elasticsearch.yml:

index:
    analysis:
        analyzer:
            string_lowercase:
                tokenizer: keyword
                filter: lowercase

But filtering like this

{
  "filter": {
    "term": {
      "Id": "2c4294c2-ca84-4f69-b648-8a014ff6e55d"
    }
  }
}

does not work for the whole GUID value, only for its parts ("2c4294c2", "ca84", ...).

Interestingly, on another machine it works properly with the same configuration.

Upvotes: 0

Views: 712

Answers (1)

bittusarkar

Reputation: 6357

You can't add a custom analyzer through elasticsearch.yml; custom analyzers are defined through the REST API instead. For your requirement, the command below creates the index with the analyzer defined:

PUT <index name>
{
   "settings": {
      "analysis": {
         "analyzer": {
            "string_lowercase": {
               "type": "custom",
               "tokenizer": "keyword",
               "filter": "lowercase"
            }
         }
      }
   }
}
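Since the mapping in the question uses the "string" type, this looks like a pre-5.x cluster, so as a quick check (not part of the original answer; the index name is a placeholder) you can run the analyzer against the GUID with the query-string form of the _analyze API:

GET <index name>/_analyze?analyzer=string_lowercase&text=2c4294c2-ca84-4f69-b648-8a014ff6e55d

If the analyzer is defined correctly, this returns a single token, 2c4294c2-ca84-4f69-b648-8a014ff6e55d, which the term filter from the question can then match exactly. Note that the field mapping still has to reference the analyzer ("analyzer": "string_lowercase"), as it already does in the question.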

Upvotes: 1
