Erik Warming

Reputation: 224

Data transformation in logstash

I have a data source, a read-only SQL server. By combining tables from this server, I am building a log that I need to upload to Elasticsearch.

To do this, I make an API call to the data source, have the data transformation happen in Logstash, and then upload the result to ES.

I have done this kind of data transformation several times before in SQL: I would JOIN several tables and INSERT the query results into a log table. But I don't have the SQL option in this setup, so I need to do the transformation in Logstash.

What I am asking for is best-practice suggestions for doing this in Logstash.

Upvotes: 1

Views: 358

Answers (1)

Croos Nilukshan

Reputation: 154

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.38-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost/student"
    jdbc_user => "Croos"
    parameters => {
    }

    schedule => "* * * * *"
    statement => "SELECT * from subject WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
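Since the question is specifically about replacing the SQL JOIN, a filter section would go between the input and output above. One common approach is the jdbc_streaming filter, which runs a lookup query per event and attaches the result, plus mutate to reshape the document. A minimal sketch, assuming a hypothetical teacher table and a teacher_id column on each subject row (neither is given in the question):

```
filter {
  # Emulate the SQL JOIN: for each subject event, fetch the
  # matching teacher row and store it under the "teacher" field.
  jdbc_streaming {
    jdbc_driver_library => "mysql-connector-java-5.1.38-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost/student"
    jdbc_user => "Croos"
    statement => "SELECT name FROM teacher WHERE id = :teacher_id"
    parameters => { "teacher_id" => "teacher_id" }
    target => "teacher"
  }

  # Shape the final log document: rename and drop fields as needed.
  mutate {
    rename => { "subject_name" => "name" }
    remove_field => ["@version"]
  }
}
```

Note that jdbc_streaming issues one query per event (it has a built-in cache to soften this), so for large joins it can be cheaper to do the JOIN directly in the jdbc input's statement, since that runs on the read-only server without writing anything.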

This link may be helpful to you.

Upvotes: 1
