Reputation: 507
Situation:
dailyImport.sh
docker exec -it $DOCKER_CONTAINER_NAME /opt/logstash/bin/logstash --path.data /tmp/logstash/data -e \
'input {
file {
path => "'$OUTPUT_PATH'"
start_position => "beginning"
sincedb_path => "/dev/null"
mode => "read"
file_completed_action => "delete"
}
}
filter {
csv {
separator => ","
columns => ["foo", "bar", "foo2", "bar2"]
}
}
output {
elasticsearch{
hosts => "localhost:9200"
index => "foo"
document_type => "foo"
}
stdout {}
}'
What I have tried and understood:
I understood that read mode combined with file_completed_action => "delete" would stop the operation once the file is fully processed; I tried it, but it didn't work. I have to press Ctrl + C manually to stop the pipeline, e.g.:
^C[2019-02-21T15:49:07,787][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2019-02-21T15:49:07,899][INFO ][filewatch.observingread ] QUIT - closing all files and shutting down.
[2019-02-21T15:49:09,764][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x6bdb92ea run>"}
Done
I have read that this should be possible, but I don't know how:
Reference: https://discuss.elastic.co/t/stop-logstash-after-processing-the-file/84959
What I want:
Logstash to shut down on its own (as if I had pressed Ctrl + C) when it reaches EOF or has "finished importing".
Upvotes: 8
Views: 4952
Reputation: 537
How about running a script from dailyImport.sh that keeps polling Elasticsearch for the expected document count and kills the Logstash process once it is reached?
It is not very elegant, but in my case it was enough.
while true; do
    if curl -s 'http://elasticsearch:9200/index/_count' | grep -q '"count":100'; then
        # Kill the Logstash process so the container stops
        kill $(ps aux | grep '[l]ogstash' | awk '{print $2}')
        break
    else
        echo "Document count from Elasticsearch does not match the expected number yet. Retrying..."
        sleep 2
    fi
done
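The fragile part of this approach is matching the count inside the JSON that the _count API returns (the response looks like {"count":100,...}, not "count: 100"). A small helper makes the parsing explicit; extract_count is a hypothetical name used here for illustration:

```shell
#!/bin/sh
# Extract the numeric "count" field from an Elasticsearch _count response.
extract_count() {
    echo "$1" | sed 's/.*"count":\([0-9]*\).*/\1/'
}

# Example response from GET /index/_count
extract_count '{"count":100,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0}}'
# prints: 100
```

The polling loop can then compare the extracted number against the expected total instead of grepping for an exact string.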
Upvotes: 0
Reputation: 7166
For the file input in read mode, there is a way to exit the process once all files have been read; just set:
input { file { exit_after_read => true } }
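Applied to the pipeline from the question, the input block would become the following (a sketch; exit_after_read is only honored in read mode and requires a reasonably recent version of the file input plugin):

```
input {
  file {
    path => "'$OUTPUT_PATH'"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    mode => "read"
    file_completed_action => "delete"
    exit_after_read => true
  }
}
```

With this set, Logstash shuts the pipeline down on its own after the last file is read, so no external kill script is needed.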
Upvotes: 4