Skull

Reputation: 1262

Parsing JSON using Logstash (ELK stack)

I have created a simple JSON file like the one below:

[
    {
        "Name": "vishnu",
        "ID": 1
    },
    {
        "Name": "vishnu",
        "ID": 1
    }
] 

I am holding these values in a file named simple.txt. I then used Filebeat to watch the file and send new updates to port 5043. On the other side, I started the Logstash service, which listens on this port in order to parse the JSON and pass it on to Elasticsearch. Logstash is not processing the JSON values; it hangs in the middle.

Logstash config:

input {
  beats {
    port => 5043
    host => "0.0.0.0"
    client_inactivity_timeout => 3600
  }
}
filter {
  json {
    source => "message"
  }
}
output {
    stdout { codec => rubydebug }
}

Filebeat config:

filebeat.prospectors:
- input_type: log
  paths:
    - filepath
output.logstash:
  hosts: ["localhost:5043"]
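
One way to confirm that Filebeat is actually reading the file and shipping events is to run it in the foreground with logging to the console (a debugging sketch, assuming the standard -e and -c flags and a config file named filebeat.yml):

filebeat -e -c filebeat.yml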

Logstash output

Sending Logstash's logs to D:/elasticdb/logstash-5.6.3/logstash-5.6.3/logs which is now configured via log4j2.properties
[2017-10-31T19:01:17,574][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/elasticdb/logstash-5.6.3/logstash-5.6.3/modules/fb_apache/configuration"}
[2017-10-31T19:01:17,578][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/elasticdb/logstash-5.6.3/logstash-5.6.3/modules/netflow/configuration"}
[2017-10-31T19:01:18,301][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-10-31T19:01:18,388][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
[2017-10-31T19:01:18,573][INFO ][logstash.pipeline        ] Pipeline main started
[2017-10-31T19:01:18,591][INFO ][org.logstash.beats.Server] Starting server on port: 5043
[2017-10-31T19:01:18,697][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Every time, I run Logstash using the command

logstash -f logstash.conf

And since no JSON is being processed, I stop the service by pressing Ctrl + C.
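
To rule out a problem with the pipeline definition itself, the config can be checked without starting the pipeline (assuming Logstash 5.x, where --config.test_and_exit only validates the file and then exits):

logstash -f logstash.conf --config.test_and_exit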

Please help me find the solution. Thanks in advance.

Upvotes: 1

Views: 4041

Answers (1)

Skull

Reputation: 1262

I finally ended up with a config like this. It works for me.

input {
  file {
    # Merge any line that does not start with "{" into the previous event,
    # so a multi-line JSON object arrives as a single message
    codec => multiline {
      pattern => '^\{'
      negate => true
      what => previous
    }
    path => "D:\elasticdb\logstash-tutorial.log\Test.txt"
    start_position => "beginning"
    # File where Logstash records how far it has read
    sincedb_path => "D:\elasticdb\logstash-tutorial.log\null"
    exclude => "*.gz"
  }
}

filter {
  json {
    source => "message"
    # Drop the raw message and metadata fields once the JSON is parsed
    remove_field => ["path", "@timestamp", "@version", "host", "message"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
    index => "logs"
    document_type => "json_from_logstash_attempt3"
  }
  stdout {}
}

JSON format:

{"name":"sachin","ID":"1","TS":1351146569}
{"name":"sachin","ID":"1","TS":1351146569}
{"name":"sachin","ID":"1","TS":1351146569}

Upvotes: 1
