Reputation: 17298
I am trying to parse a JSON file into Elasticsearch using Logstash, but I can't get it to work. I guess I need to write a grok pattern, but I haven't managed to. How can I send the JSON below into Elasticsearch using Logstash?
{
  "machinename": "test1",
  "longdate": "2019-01-29 13:19:32",
  "level": "Error",
  "mysite": "test1",
  "message": "test2",
  "exception": "test3",
  "timestamp": "2019-01-29T13:19:32.257Z"
}
My Logstash config file:
input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    discover_interval => 10
    stat_interval => 10
    sincedb_write_interval => 10
    close_older => 10
    codec => multiline {
      negate => true
      what => "previous"
    }
  }
}
filter {
  date {
    match => ["TimeStamp", "ISO8601"]
  }
  json {
    source => "request"
    target => "parsedJson"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "log-%{+YYYY.MM}"
  }
}
ERROR:
[2019-01-29T14:30:54,907][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:30:56,929][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:30:59,167][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 12, column 18 (byte 281) after input {\n file {\n\t path => [\"P:/logs/*.txt\"]\n\t\tstart_position => \"beginning\" \n\t\tdiscover_interval => 10\n\t\tstat_interval => 10\n\t\tsincedb_write_interval => 10\n\t\tclose_older => 10\n codec => multiline { \n\t\tpattern => \"^%{TIMESTAMP_ISO8601}\\"\n\t\tnegate => true\n what => \"", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:31:00,417][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-29T14:34:23,554][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:34:24,554][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:34:27,486][ERROR][logstash.codecs.multiline] Missing a required setting for the multiline codec plugin:
codec {
multiline {
pattern => # SETTING MISSING
...
}
}
[2019-01-29T14:34:27,502][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "P:/elk/logstash/logstash-core/lib/logstash/codecs/base.rb:19:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:97:in `plugin'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:110:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:34:27,971][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Upvotes: 1
Views: 4878
Reputation:
You can try using the json filter plugin for Logstash.
This way the json filter plugin will parse the JSON that the file input places in the message field:
filter {
  json {
    source => "message"
  }
}
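Putting it together: since each of your log entries is a single JSON object on one line, you likely don't need the multiline codec at all. Below is a minimal sketch of a complete pipeline. Note that Logstash field references are case-sensitive, so the date filter should match the lowercase timestamp field from your document, not TimeStamp. The sincedb_path value is an assumption for testing on Windows, not something your setup requires.

```
input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    # Assumption: "NUL" disables the sincedb on Windows so the file is
    # re-read on every test run; remove this in production.
    sincedb_path => "NUL"
  }
}
filter {
  # Parse the raw JSON line that the file input stored in "message"
  json {
    source => "message"
  }
  # Field names are case-sensitive: the document has "timestamp",
  # not "TimeStamp", so match on the lowercase name.
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "log-%{+YYYY.MM}"
  }
}
```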
Another thing that's good to have is tag_on_failure. This way, if the JSON is invalid or can't be parsed, the event will still reach Elasticsearch/Kibana, but tagged with _jsonparsefailure.
filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
}
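If you'd rather keep unparsable lines out of the main index, you can branch on that tag in the output section. A sketch, reusing the elasticsearch output settings from your config:

```
output {
  if "_jsonparsefailure" in [tags] {
    # Unparsable lines: print them so you can inspect the raw message
    stdout { codec => rubydebug }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "log-%{+YYYY.MM}"
    }
  }
}
```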
Upvotes: 2