Federico Nafria

Reputation: 1600

Logstash is putting old data in Elasticsearch. Can't clean Elasticsearch data

I have Logstash reading from a log file status.log and sending the output to an Elasticsearch instance.

I want to clean the data in Elasticsearch. To do that, I'm executing curl -XDELETE 'http://localhost:9200/index_name/_all'. If I check through the head plugin, the data is gone.
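For reference, a minimal sketch of the two delete forms the Elasticsearch REST API documents (assuming the index is literally named index_name; the trailing _all in the URL above looks like a leftover, since _all normally stands in place of the index name rather than following it):

# Delete a single index and all of its documents:
curl -XDELETE 'http://localhost:9200/index_name'

# Delete every index on the node (destructive; some clusters disable this):
curl -XDELETE 'http://localhost:9200/_all'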

To be thorough, I'm also clearing the log file with echo "" > status.log.

When I run the application again, the old data reappears in Elasticsearch, but with an updated @timestamp. The old data does not reappear in status.log. The new data is inserted into Elasticsearch correctly.

How can I get rid of the old data? Is it still stored in Elasticsearch, or does Logstash have some kind of cache?

Upvotes: 0

Views: 495

Answers (1)

markus

Reputation: 1651

I'm assuming you are working with the file input plugin. If you add the stdout plugin to your output section like so:

stdout { codec => rubydebug }

Logstash will print each processed event to the console. When using the file input, every processed log message gets a path field telling you where Logstash read the message from. Maybe that helps you find out where the old messages are coming from...
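In context, a minimal output section might look like the sketch below (the elasticsearch settings are assumptions; parameter names such as hosts vary between Logstash versions, so adapt them to your config):

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # assumption: adjust host/port to your setup
    index => "index_name"         # assumption: the index you are cleaning
  }
  # Prints every event to the console; with the file input each event
  # carries a path field showing which file it was read from.
  stdout { codec => rubydebug }
}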

Upvotes: 1
