krckumar

Reputation: 544

Working on JSON-based logs using Logstash

I have a log file which contains entries in the following format:

{ "start_time" : "12-May-2011", "name" : "this is first heading", "message" : "HELLO this is first message" }
{ "start_time" : "13-May-2011", "name" : "this is second heading", "message" : "HELLO this is second message" }
{ "start_time" : "14-May-2011", "name" : "this is third heading", "message" : "HELLO this is third message" }
...

I am new to Logstash. I currently have an app that writes these log entries as JSON strings, one below the other, to a file (say /root/applog/scheduler.log).

I am looking for some help on how to parse this JSON from the logs into separate fields and print them to stdout. What should the conf file look like?

Note: the idea is to later feed this into Kibana for visualization.

Upvotes: 0

Views: 66

Answers (2)

hurb

Reputation: 2217

Example config:

input {
    file {
        path => ["/root/applog/scheduler.log"]
        codec => "json"
        start_position => "beginning" # If your file already exists
    }
}

filter { } # Add filters here (optional)

output {
    elasticsearch { } # pass the output to ES to prepare visualization with kibana
    stdout { codec => "rubydebug" } # If you want to see the result in stdout
} 
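
To try it, one way (the file name scheduler.conf below is just an example) is to save the config and start Logstash with it:

    bin/logstash -f scheduler.conf

With the json codec on the input, each log line is decoded into an event, so its fields (start_time, name, message) show up individually in the rubydebug output and in Elasticsearch.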

Upvotes: 2

Alain Collins

Reputation: 16362

Logstash includes a json codec that will split your JSON into fields for you.
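
For example, a minimal sketch of that idea applied to the file from the question (no filter section needed; the stdout output is only there to check the parsed fields):

    input {
        file {
            path => "/root/applog/scheduler.log"
            codec => "json"
        }
    }

    output {
        stdout { codec => "rubydebug" }
    }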

Upvotes: 0
