Ganga

Reputation: 923

Logstash custom log parsing

I need your help with custom log parsing through Logstash.

Here is the log format that I am trying to parse through Logstash:

2015-11-01 07:55:18,952 [abc.xyz.com] - /Enter, G, _null, 2702, 2, 2, 2, 2, PageTotal_1449647718950_1449647718952_2_App_e9c00521-eeec-4d47-bf5b-b842ec14a4ff_178.255.153.2___, , , NEW, 

And my Logstash conf file looks like this:

input {
  file {
    path => [ "/tmp/access.log" ]
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSSS"]
  }
}

For some reason, running the Logstash command with the conf file doesn't parse the logs, and I'm not sure what's wrong with the config. Any help would be highly appreciated.

bin/logstash -f conf/access_log.conf
Settings: Default filter workers: 6
Logstash startup completed
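
For reference, the config above has no output section, so one quick way to verify whether events are being parsed at all during testing is a stdout output with the rubydebug codec; a minimal sketch (not in my original config):

output {
  # print every event to the console so the parsed fields are visible
  stdout { codec => rubydebug }
}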

Upvotes: 0

Views: 1424

Answers (1)

Stefano Bossi

Reputation: 1447

I have checked your grok match filter with the Grok Debugger and it is fine.

You don't have to use the date matcher because the grok matcher already matches the TIMESTAMP_ISO8601 timestamp correctly.
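
That said, if you do want @timestamp to be taken from the log line rather than from the time Logstash reads it, you can keep the date filter; a minimal sketch, using three millisecond digits to match your sample line:

date {
  # "timestamp" is the field extracted by the grok pattern;
  # SSS matches the three millisecond digits in "07:55:18,952"
  match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
}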

I think your problem is with the "sincedb" file. Here is the documentation: sincedb

In a few words, Logstash remembers whether a file has already been read and doesn't read it again. It keeps track of this by recording its position in the sincedb file. If you want to test your filter by reading the same file each time, you could try:

input {
  file {
    path => [ "/tmp/access.log" ]
    sincedb_path => "/dev/null"
  }
}
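
Note also that, by default, the file input reads from the end of the file (like tail), so if /tmp/access.log already contains data when Logstash starts you may also want start_position => "beginning"; a sketch combining it with the sincedb setting above:

input {
  file {
    path => [ "/tmp/access.log" ]
    # read the existing content from the top instead of only new lines
    start_position => "beginning"
    # don't persist the read position, so the file is re-read on every run
    sincedb_path => "/dev/null"
  }
}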

Regards

Upvotes: 1
