iam.Carrot

Reputation: 5286

Filter for my Custom Logs in Logstash

I am new to the ELK stack. I want to use it to push my logs into Elasticsearch so that I can use Kibana on them. Below is the format of my custom log:

Date Time INFO - searchinfo#username#searchQuery#latitude#longitude#client_ip#responseTime

The below is an example of a log that follows the format.

2017-07-04 11:16:10 INFO  - searchinfo#null#gate#0.0#0.0#180.179.209.54#598

I am using Filebeat to push my .log files to Logstash, and Logstash pushes that data into Elasticsearch.
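
Roughly, my pipeline config looks like the sketch below (the 5044 port, the localhost:9200 host, and the index name are just placeholders for my setup); the filter section is the part I'm missing:

input {
  beats {
    port => 5044
  }
}

filter {
  # this is the part I need help with
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "applogs-%{+YYYY.MM.dd}"
  }
}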

I need help writing a filter for the Logstash config that splits each line on # and puts the data into the respective fields in the Elasticsearch index.

How can I do this?

Upvotes: 0

Views: 2212

Answers (2)

Sayalic

Reputation: 7530

Try the grok filter plugin to parse your logs into structured data:

filter {
  grok {
    match => { "message" => "\A%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{WORD:var0}%{SPACE}%{NOTSPACE}%{SPACE}(?<searchinfo>[^#]*)#(?<username>[^#]*)#(?<searchQuery>[^#]*)#(?<latitude>[^#]*)#(?<longitude>[^#]*)#(?<client_ip>[^#]*)#(?<responseTime>[^#]*)" }
  }
}

You can test the pattern with an online grok debugger.
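
If you also want @timestamp to reflect the time from the log line instead of the ingest time, you can follow the grok with a date filter, something like this (the timestamp field name comes from the pattern above):

filter {
  date {
    # parses e.g. "2017-07-04 11:16:10" from the captured field
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}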

Upvotes: 1

whites11

Reputation: 13340

You need to use a grok filter to parse your log.

You can try with this:

filter {
  grok {
    match => { "message" => "\A%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{WORD:var0}%{SPACE}%{NOTSPACE}%{SPACE}(?<var1>[^#]*)#(?<var2>[^#]*)#(?<var3>[^#]*)#(?<var4>[^#]*)#(?<var5>[^#]*)#(?<var6>[^#]*)#(?<var7>[^#]*)" }
  }
}

This will parse your log and add fields named var0, var1, etc. to the parsed document. You can rename these variables as you prefer, for example with a mutate filter (see the sketch below).
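
A sketch of such a mutate filter, using the field names from the question (mutate applies rename before convert, so one block is enough):

filter {
  mutate {
    rename => {
      "var1" => "searchinfo"
      "var2" => "username"
      "var3" => "searchQuery"
      "var4" => "latitude"
      "var5" => "longitude"
      "var6" => "client_ip"
      "var7" => "responseTime"
    }
    # convert the numeric fields so Elasticsearch indexes them as numbers
    convert => {
      "latitude"     => "float"
      "longitude"    => "float"
      "responseTime" => "integer"
    }
  }
}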

Upvotes: 1
