Reputation: 121
I have created an index in Elasticsearch to store some test logs, but I can't get the data to load from Logstash. I create the index in ES:
PUT sonicwall
{
  "mappings": {
    "properties": {
      "Time":     { "type": "text" },
      "Category": { "type": "text" },
      "Group":    { "type": "text" },
      "Event":    { "type": "text" },
      "Priority": { "type": "text" }
    }
  }
}
Now, my Logstash config file:
input {
  file {
    path => "C:/Elastic/logsPrueba/log3.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Time","Category","Group","Event","Priority"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sonicwall"
  }
  stdout {}
}
This is the CSV.
When I run Logstash, the data is never loaded. Can someone help me?
Upvotes: 0
Views: 797
Reputation: 121
I think I have found the problem. When the CSV file is read for the first time, Logstash records its read position in a sincedb file at the end of the run, so on later runs it does not start from the beginning of the file. I have tried to work around this with sincedb_path => "NUL" on Windows or sincedb_path => "/dev/null" on Linux, but Logstash is still not reading the file from the beginning each time: if I add a new log line and run Logstash, only the newly added line is printed.
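For reference, a minimal sketch of the file input that is usually suggested to force a full re-read on every run, assuming Windows (on Windows the null device is NUL with a single L; on Linux it is /dev/null). The path is the one from the question:

input {
  file {
    path => "C:/Elastic/logsPrueba/log3.csv"
    start_position => "beginning"
    # Discard the position record so the file is re-read from the top
    # on every Logstash run (use "/dev/null" on Linux).
    sincedb_path => "NUL"
  }
}

Note that a value like "NULL" is not the null device on Windows; it just creates an ordinary file named NULL, so the read position is still persisted between runs.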
Upvotes: 0
Reputation: 3690
Check the Logstash event stats to understand where the problem is:
curl -XGET 'localhost:9600/_node/stats/events?pretty'
This shows how many events were received, filtered, and emitted, which should help explain why the logs are not being indexed into the Elasticsearch structure you designed.
Upvotes: 0