Reputation: 781
I have 3 tables in MySQL and indexed the data into Elasticsearch.
It worked with the Logstash config below:
input {
  jdbc {
    jdbc_driver_library => "lib/mysql-connector-java-5.1.33.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
    jdbc_user => "test"
    jdbc_password => "test"
    statement => "SELECT * FROM TABLE1"
    schedule => "* * * * *"
    type => "table1"
  }
  # two more inputs
}
output {
  stdout { codec => rubydebug }
  if [type] == "table1" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "test"
      document_type => "%{type}"
      document_id => "%{id}"
      template => "./template"
      template_overwrite => true
    }
  }
  # two more outputs
}
I deleted the other two inputs and outputs, and also deleted some rows from TABLE1 in MySQL.
Then I started Logstash, but the index was not updated.
So I ran
curl -XDELETE 'http://localhost:9200/*'
and restarted Logstash again.
But all three tables were indexed again with the same rows in Elasticsearch.
How can I update the Elasticsearch data automatically?
Upvotes: 0
Views: 1141
Reputation: 9434
I guess you're missing use_column_value, tracking_column, and its type in your jdbc input:
jdbc {
  jdbc_driver_library => "lib/mysql-connector-java-5.1.33.jar"
  jdbc_driver_class => "com.mysql.jdbc.Driver"
  jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
  jdbc_user => "test"
  jdbc_password => "test"
  statement => "SELECT * FROM TABLE1"
  schedule => "* * * * *"
  type => "table1"
  use_column_value => true
  tracking_column => "id" # this should be the incrementing column in your table
  tracking_column_type => "numeric"
}
tracking_column tells the jdbc input which column to keep track of between scheduled runs, so Logstash knows how far it has already indexed your table. This should do the trick for you. Hope it helps!
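Note that tracking_column is normally paired with the :sql_last_value parameter in your statement, so that each scheduled run only fetches rows newer than the last value seen. A minimal sketch (assuming id is an auto-incrementing column, as above):

statement => "SELECT * FROM TABLE1 WHERE id > :sql_last_value"

The last seen value is persisted on disk between runs (by default in ~/.logstash_jdbc_last_run), which is also why deleting the index alone does not change what the jdbc input fetches next.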
Upvotes: 2