Reputation: 3874
I am currently under development and using the following configuration for a logstash agent:
input {
  file {
    type => "access_log"
    # Wildcards work, here :)
    path => [ "/root/isaac/vuforia_logs/access_logs/gw_access_log.2014-02-19.00.log" ]
    start_position => "beginning"
  }
}
filter {
  if [type] == "access_log" {
    grok {
      pattern => "\[%{DATA:my_timestamp}\] %{IP:client} %{WORD:method} %{URIPATHPARAM:request} \[%{DATA:auth_data}\] \[%{DATA:another_timstamp}\] %{NUMBER:result_code} %{NUMBER:duration} %{NUMBER:bytes}"
    }
  }
}
output {
  stdout { debug => true }
  elasticsearch_http {
    host => "192.168.79.128"
  }
}
The very first time it reads the file, it processes it and logs to stdout and Elasticsearch. The problem is that when I restart Logstash it does not do anything, which I presume is because it kept the position where it stopped last time. I am interested in resetting Logstash so that it re-processes the file from the beginning. This is for development and testing purposes; is there a way to reset/clean the Logstash state?
Thanks
Upvotes: 6
Views: 5845
Reputation: 4858
For those who got here with the jdbc input: you need last_run_metadata_path.
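As a minimal sketch (the connection string, query, and paths below are placeholders, not from the question): the jdbc input saves its position in the file named by last_run_metadata_path, so deleting that file, or setting clean_run => true, makes it re-run from scratch.

```
input {
  jdbc {
    # Placeholder connection details
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    statement => "SELECT * FROM logs WHERE id > :sql_last_value"
    # State is persisted here; delete this file to reset
    last_run_metadata_path => "/tmp/.logstash_jdbc_last_run"
    # Or ignore any previously saved state on startup:
    # clean_run => true
  }
}
```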
Upvotes: 3
Reputation: 7890
Logstash records a sincedb for the file input, which stores the last read position. The default path is your $HOME directory; see the file input documentation for more information.
If you want to reset/clean it, delete all the .sincedb_* files in your $HOME. When you restart Logstash, it will read the file from the beginning again.
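For development, a common alternative (assuming a Unix-like host, so /dev/null exists) is to point sincedb_path at /dev/null in the file input, so no position is ever persisted and every restart re-reads the file:

```
input {
  file {
    type => "access_log"
    path => [ "/root/isaac/vuforia_logs/access_logs/gw_access_log.2014-02-19.00.log" ]
    start_position => "beginning"
    # Never persist the read position (dev/testing only)
    sincedb_path => "/dev/null"
  }
}
```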
Upvotes: 9