Code Guru

Reputation: 15578

How to use a stopwords analyzer in Elasticsearch

I am new to Elasticsearch and used the logstash-input-jdbc plugin to load data from a MySQL database into an Elasticsearch server. Here is the Logstash config:

input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/news"
        jdbc_user => "root"
        jdbc_password => "sunilgarg"
        jdbc_validate_connection => true
        jdbc_driver_library => "../jars/mysql-connector-java-5.1.21.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "SELECT * from news"
    }
}
output {
    stdout { codec => json_lines }
    elasticsearch {
        index => "news"
        document_type => "news"
        hosts => "localhost:9200"
        document_id => "%{id}"
    }
}
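
For reference, I run the pipeline with (the config file name is arbitrary):

bin/logstash -f news.conf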

After loading all the data from the database into the Elasticsearch server, I closed the index and updated the analyzer settings with the following PUT request:

{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_english_analyzer": {
          "type": "standard",
          "max_token_length": 5,
          "stopwords": "_english_"
        }
      }
    }
  }
}

After reopening the index with a POST request to /news/_open, I can still find documents by searching for stop words like the, of, was, etc.
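
For completeness, the exact sequence of requests I ran looks like this (curl against localhost:9200; analyzer.json is a local file containing the settings JSON above):

curl -XPOST 'localhost:9200/news/_close'
curl -XPUT 'localhost:9200/news/_settings' -d @analyzer.json
curl -XPOST 'localhost:9200/news/_open'

A search for a stop word still returns documents:

curl 'localhost:9200/news/_search?q=the'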

What is the issue? Am I doing something wrong?

Upvotes: 0

Views: 51

Answers (1)

mel

Reputation: 2790

The analyzer is applied at indexing time, so documents that were already indexed are not re-analyzed when you change the settings afterwards. Defining an analyzer is also not enough on its own: your mapping has to reference it. You should define your analyzer and mapping first, then index your documents.
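
For example, you could delete and recreate the index with the analyzer defined up front and a mapping that actually uses it, then load the data again. A minimal sketch (the title and content fields are only illustrations, adapt them to your news table; on Elasticsearch 5.x+ use the text field type instead of string):

curl -XDELETE 'localhost:9200/news'

curl -XPUT 'localhost:9200/news' -d '{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_english_analyzer": {
          "type": "standard",
          "max_token_length": 5,
          "stopwords": "_english_"
        }
      }
    }
  },
  "mappings": {
    "news": {
      "properties": {
        "title":   { "type": "string", "analyzer": "my_english_analyzer" },
        "content": { "type": "string", "analyzer": "my_english_analyzer" }
      }
    }
  }
}'

Then re-run your Logstash pipeline so the documents are analyzed with the new analyzer as they are indexed.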

Upvotes: 1
