Reputation: 57
So, basically I have a single log or text file that I want to visualize using ELK. I was able to set up Elasticsearch and Kibana on the system. This is my Logstash config file right now:
input {
  file {
    path => "G:/everything.log"
    start_position => "beginning"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "example"
  }
  stdout { codec => rubydebug }
}
When I open http://localhost:9200/ in the browser, this is what appears:
{
  "name" : "1rtH6q6",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "oicreqLyQ_iNiTrOQ0sYPQ",
  "version" : {
    "number" : "5.4.1",
    "build_hash" : "2cfe0df",
    "build_date" : "2017-05-29T16:05:51.443Z",
    "build_snapshot" : false,
    "lucene_version" : "6.5.1"
  },
  "tagline" : "You Know, for Search"
}
In my Kibana console, after executing GET /_cat/indices?v&pretty, I get:
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open .kibana fxFjIS55Q9-qAgqLlPE0Cw 1 1 2 0 6.3kb 6.3kb
yellow open windows_events YUBWMzRpRTmEdu6E2UoCXg 5 1 2 0 12.6kb 12.6kb
As you can see, the example index is never created. Please help me solve this problem.
Upvotes: 1
Views: 3329
Reputation: 7776
As far as I understand your question, you need to do a couple of things in order to display your logs in Kibana.
1) Based on your log pattern, you have to write an appropriate grok pattern to parse your log file. You can also use the Grok Debugger utility to build the grok pattern for your log.
E.g. for an Apache access log, the grok pattern would be:
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
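Putting this together with the config from your question, here is a minimal sketch of the full pipeline (assuming your log really is in Apache combined format; swap in a pattern that matches your actual log lines):
input {
  file {
    # path and start_position taken from your original config
    path => "G:/everything.log"
    start_position => "beginning"
  }
}
filter {
  # parse each raw line into structured fields
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # use the timestamp parsed from the log line as the event time,
  # instead of the time Logstash ingested the line
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "example"
  }
  stdout { codec => rubydebug }
}
With the date filter in place, Kibana's time filter will line up with the times in your log rather than the ingest time.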
I would recommend that you read this official guideline.
Once your log file is parsed correctly, you will see the data indexed into your example Elasticsearch index.
To validate the data, you can use the GET command below:
curl -XGET 'localhost:9200/example/_search'
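If you only want to confirm that documents made it in, the standard _count endpoint gives you just the document count:
curl -XGET 'localhost:9200/example/_count?pretty'
A count of zero here means Logstash is not shipping events, so the problem is on the input/filter side rather than in Kibana.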
2) In the next step, you have to configure the default Elasticsearch index pattern in Kibana. For details, read this reference.
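In Kibana 5.x the usual route is the UI: Management → Index Patterns → Create, entering example as the pattern name. If you prefer the command line, index patterns in 5.x are stored as documents in the .kibana index; the sketch below assumes that layout (the exact document structure can vary between minor versions, so treat it as a starting point):
curl -XPUT 'localhost:9200/.kibana/index-pattern/example' -d '
{
  "title": "example",
  "timeFieldName": "@timestamp"
}'
The timeFieldName of @timestamp matches the field the date filter above populates.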
Upvotes: 3