mangesh shinde

Reputation: 1

indexing updated log lines in logstash

I have used the Logstash file input to get logs indexed into Elasticsearch and then into Kibana. I use a grok filter to extract some fields and the data comes out correctly, but when I update the log file (appending a few lines to the existing file), Logstash scans that file from the first line again. I want only the newly appended lines to be processed, not the lines that have already been scanned.

input {
  file {
    path => ""
    codec => multiline {
      pattern => "^[0-9]{4}-[0-9]{2}-[0-9]{2}"
      negate => true
      what => "previous"
    }
  }
}
filter {
  grok {
    match => { "message" => "(?<time_stamp>[\d\-\s\:\,]*)\s%{WORD:log_level}\s%{JAVACLASS:class}\s(\[%{DATA:thread}\])\s+(?<msg>(.|\r\n)*)" }
  }
  mutate {
    copy => { "msg" => "message" }
  }
  grok {
    match => { "path" => "(?<index_name>[^\\/]+?(?=\.\w+$))" }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://*.*.*.*:9200"]
    index => ""
  }
}

Upvotes: 0

Views: 168

Answers (1)

Filip

Reputation: 661

If I understood correctly, you want to read just the updates from files. This can be achieved with tail mode in the Logstash file input.

In this mode the plugin aims to track changing files and emit new content as it’s appended to each file.

Besides that, when using tail mode you can choose whether previously unseen files are read from the beginning or from the end; see the start_position option in the file input documentation.
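
As a minimal sketch, a file input in tail mode could look like the block below. The path is hypothetical, and mode => "tail" is already the default in recent versions of the plugin:

input {
  file {
    path => "/var/log/app/app.log"    # hypothetical path; point this at your log file
    mode => "tail"                    # emit only content appended after the file was first seen
    start_position => "beginning"     # for files never seen before, read from the start ("end" is the default)
  }
}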

The progress of tail mode is saved to a sincedb file, so if Logstash or the machine is restarted, it will continue from the last processed record.
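
If you need that state to live in a specific location (for example, to survive container restarts), you can set sincedb_path explicitly; the paths below are only examples:

input {
  file {
    path => "/var/log/app/app.log"                    # hypothetical path
    mode => "tail"
    sincedb_path => "/var/lib/logstash/app.sincedb"   # read offsets are persisted here across restarts
  }
}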

Upvotes: 0
