Reputation: 4368
I am learning Azure Cognitive Search and got a bit confused about Analyzers and Normalizers. https://learn.microsoft.com/en-us/azure/search/search-analyzers https://learn.microsoft.com/en-us/azure/search/search-normalizers
As far as I understand, the only difference is that analyzers perform tokenization.
Could someone provide a good example of when I should use one over the other?
Thank you for your time!
Upvotes: 0
Views: 491
Reputation: 11
The values that you pass in a filter, sort, or facet can't be analyzed, so "normalizers" were created to fill that gap. They don't do everything that an analyzer can do (they support only a limited set of character filters and token filters, and no tokenizer), but they take care of the bigger issues, like normalizing text casing and stripping punctuation.
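To make that concrete, here is a minimal sketch of an index definition that puts a normalizer on a filterable field and an analyzer on a full-text field. The service URL, admin key, api-version, index name, and field names are placeholders I made up for illustration; "lowercase" and "en.microsoft" are built-in names from the docs you linked.

```python
import requests

# Placeholders: substitute your own service name, admin key, and an
# api-version that supports normalizers (introduced in the preview API versions).
SERVICE = "https://<your-service>.search.windows.net"
API_KEY = "<admin-key>"
API_VERSION = "2020-06-30-Preview"

index_definition = {
    "name": "hotels",
    "fields": [
        {"name": "id", "type": "Edm.String", "key": True},
        # Full-text field: an analyzer tokenizes the content for search.
        {"name": "description", "type": "Edm.String",
         "searchable": True, "analyzer": "en.microsoft"},
        # Filter/sort/facet field: a normalizer canonicalizes the whole value
        # (lowercasing here) without splitting it into tokens.
        {"name": "city", "type": "Edm.String",
         "filterable": True, "sortable": True, "facetable": True,
         "normalizer": "lowercase"},
    ],
}

# Create or update the index via the REST API.
resp = requests.put(
    f"{SERVICE}/indexes/{index_definition['name']}",
    params={"api-version": API_VERSION},
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    json=index_definition,
)
resp.raise_for_status()
```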
Upvotes: 1
Reputation: 5353
The simplest explanation: use an analyzer for properties containing blocks of text. A normalizer is more suitable for properties with short content that you would typically use for filtering or sorting, like City, Country, or Name.
A block of text will have content in a specific language. A language-specific analyzer will do a better job of producing good tokens for internal use by the search engine. You will find that you get better recall for textual content that is correctly processed using a relevant analyzer.
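As a rough illustration of the difference at query time, using the Python SDK and assuming a hypothetical index like the one sketched in the other answer (lowercase normalizer on city, en.microsoft analyzer on description; endpoint, index name, and key are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Placeholders for your own service endpoint, index, and query key.
client = SearchClient(
    endpoint="https://<your-service>.search.windows.net",
    index_name="hotels",
    credential=AzureKeyCredential("<query-key>"),
)

# Full-text search against "description": the language-specific analyzer
# tokenizes and lemmatizes, so "running" can also match documents
# containing "run".
text_hits = client.search(search_text="running", search_fields=["description"])

# Filter against "city": with the lowercase normalizer, 'paris' matches
# documents stored as "Paris"; without a normalizer the comparison would be
# an exact, case-sensitive match.
filter_hits = client.search(search_text="*", filter="city eq 'paris'")

for doc in filter_hits:
    print(doc["city"])
```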
Upvotes: 2