Saïd Belhadj

Reputation: 45

How can I dynamically update my data from MySQL with Logstash (without duplicates)?

I have configured logstash.conf to insert the data from my database dynamically, but the issue is this:

when I change a row in my table, that row is not updated in my index, because I only insert values added after sql_last_value. I thought about using a trigger, but I'm not sure how to do it.

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/blog"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "C:\Users\saidb\Downloads\mysql-connector-java-5.1.47\mysql-connector-java-5.1.47.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    schedule => "* * * * *"
    # fetch only rows whose id is greater than the last tracked value
    statement => "SELECT * FROM blog_pro WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "blog_pro"
    document_type => "data"
  }
}

Upvotes: 3

Views: 3852

Answers (1)

Dushmantha

Reputation: 3051

If you use id to select rows, you cannot detect updates. You have two options:

  1. Select all the rows each time and send them to ES using the query SELECT * FROM blog_pro, which I don't think is a good option in your scenario.

  2. Create a new column last_modified_time that contains the last-modified timestamp of the record (row), then use that to filter rows. Note the property tracking_column_type => "timestamp". The relevant settings are:

statement => "SELECT * FROM blog_pro WHERE last_modified_time > :sql_last_value"
use_column_value => true
tracking_column => "last_modified_time"
tracking_column_type => "timestamp"
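
If the table does not already have such a column, a minimal MySQL sketch for adding it could look like the following (an assumption on my part, not part of the original answer; the column name last_modified_time matches the config above, and it requires MySQL 5.6 or later). With ON UPDATE CURRENT_TIMESTAMP, MySQL maintains the timestamp automatically on every insert and update, so no trigger is needed:

ALTER TABLE blog_pro
  ADD COLUMN last_modified_time TIMESTAMP NOT NULL
  DEFAULT CURRENT_TIMESTAMP
  ON UPDATE CURRENT_TIMESTAMP;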

Here is the full Logstash config:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://192.168.3.57:3306/blog_pro"
    jdbc_user => "dush"
    jdbc_password => "dush"
    jdbc_driver_library => "F:\logstash-6.2.2\bin\mysql-connector-java-5.1.6.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # run every minute
    schedule => "* * * * *"
    statement => "SELECT * FROM blog_pro WHERE last_modified_time > :sql_last_value"
    use_column_value => true
    tracking_column => "last_modified_time"
    tracking_column_type => "timestamp"
  }
}

output {
  # output to Elasticsearch
  elasticsearch {
    hosts => [ "192.168.1.245:9201" ]
    action => "update"
    # "%{id}" -> primary key of the table
    document_id => "%{id}"
    doc_as_upsert => true
  }
}

Note: because document_id => "%{id}" maps every row to the same Elasticsearch document, and doc_as_upsert => true turns updates into upserts, a changed row overwrites its existing document instead of creating a duplicate. You may need to clear your index and start indexing from scratch with this configuration. I tested this and it works fine.
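
If you do need to clear the index first, the Elasticsearch delete index API can be used, for example (assuming the index is named blog_pro as in the question; adjust the host and port to your setup):

curl -X DELETE "localhost:9200/blog_pro"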

Elasticsearch version = 5.x.x

Logstash version = 6.2.2

Upvotes: 4
