Reputation: 5875
My use case is: first fetch a timestamp from the database, then run a SQL query that uses that timestamp. Which means a Logstash config that (theoretically) would look something like:
input {
  jdbc {
    # get the timestamp
  }
  jdbc {
    # do the SQL that gets lots of data with the timestamp above
  }
}
output {
  elasticsearch {
    # spew the data from that second jdbc query
  }
}
That doesn't work, of course, but it gives the idea of the use case. How can I solve this scenario?
Upvotes: 2
Views: 301
Reputation: 4072
Use a jdbc_streaming filter, which is designed for exactly this use case.
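With that approach you keep a single jdbc input that fetches the timestamp, and the jdbc_streaming filter runs the big query once per event, binding an event field to a named parameter in the SQL statement. A minimal sketch, assuming driver paths, connection details, and table/field names that are placeholders rather than anything from the question:

```
input {
  jdbc {
    # connection settings omitted; this query produces an event
    # with a "last_run_timestamp" field (placeholder name)
    statement => "SELECT max(updated_at) AS last_run_timestamp FROM sync_log"
  }
}
filter {
  jdbc_streaming {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"   # placeholder path
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb" # placeholder DB
    jdbc_user => "user"
    jdbc_password => "password"
    # :ts is bound from the event field named in `parameters`
    statement => "SELECT * FROM big_table WHERE updated_at > :ts"
    parameters => { "ts" => "last_run_timestamp" }
    # query results are stored on the event as an array under this field
    target => "rows"
  }
}
```

The filter attaches the second query's result rows to the event under `target`; from there a split filter (or the Elasticsearch output's mapping) can fan the `rows` array out into individual documents.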
Upvotes: 1