Nitesh

Reputation: 77

Data ingestion in elasticsearch with database having large number of tables

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "favorite_artist" => "Beethoven" }
    schedule => "* * * * *"
    statement => "SELECT * from songs where artist = :favorite_artist"
  }
}

With the above Logstash configuration file, how does the data get ingested?
What should I do when I have to select from multiple tables?

Upvotes: 0

Views: 303

Answers (1)

Sukanya Arumugam

Reputation: 171

Data is ingested based on the SELECT statement in the `statement` option. If you want data from multiple tables, you can write a join query combining those tables, and the result set of that query is what gets ingested into ES. It all depends on your specific use case. Here is a sample for your reference.

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "favorite_artist" => "Beethoven" }
    schedule => "* * * * *"
    statement => "SELECT * FROM songs INNER JOIN song_folder USING (song_number) ORDER BY song_title"
  }
}

output {
  elasticsearch {
    hosts => "http://xx:XX:XX:XX:9200"
    index => "song"
    document_type => "songname"
    document_id => "%{song_title}"
  }
  stdout { codec => rubydebug }
}
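
If a single join query does not fit your use case, another common pattern is to declare one jdbc input per table and route each to its own index via a type tag. The sketch below assumes two tables, songs and artists, and index names derived from the type field; adjust the table names, credentials, and index naming to your setup.

  input {
    jdbc {
      jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
      jdbc_driver_class => "com.mysql.jdbc.Driver"
      jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
      jdbc_user => "mysql"
      schedule => "* * * * *"
      statement => "SELECT * FROM songs"
      type => "songs"        # tag events from this table
    }
    jdbc {
      jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
      jdbc_driver_class => "com.mysql.jdbc.Driver"
      jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
      jdbc_user => "mysql"
      schedule => "* * * * *"
      statement => "SELECT * FROM artists"
      type => "artists"      # tag events from this table
    }
  }

  output {
    elasticsearch {
      hosts => "http://localhost:9200"
      index => "%{type}"     # one index per table, named after the type tag
    }
  }

Each jdbc input runs its own statement on its own schedule, so the tables do not need any relationship between them; the sprintf reference %{type} in the output sends each table's rows to a separate index.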


Please let me know if you have any further queries.

Upvotes: 1
