Reputation: 59
I need to input a CSV file to Logstash, pick out the source IP field, and convert it to geoip. My Kibana message currently looks like this:
June 29th 2016, 12:22:07.194 message:"Jun 27, 2016, 10:56:17 PM",107.77.212.114,10.119.1.29,HTTP 200 - Ok,CTSUSCHDSXCM003,1 @version:1 @timestamp:June 29th 2016, 12:22:07.194 path:/mnt/shiny/ELT.csv host:ubuntuserver _id:AVWa7d0P6YdQaT-CDTqx _type:logs _index:elt1 _score:
Can someone help me with a Logstash config file to achieve this?
I have tried:
input {
  file {
    path => "/mnt/shiny/ELT.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["Start Time","Source IP","Destination IP","Event Name","Log Source","Event Count"]
    separator => ","
  }
  geoip {
    source => "Source IP"
    target => "geoip"
    database => "/etc/logstash/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "elt1"
  }
}
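To illustrate what the csv and geoip stages above produce, here is a minimal Python sketch that parses the sample log line into the named columns and builds a [longitude, latitude] coordinates array, the way the two add_field lines do (longitude appended first, then latitude). The coordinate values are placeholders, not a real GeoIP lookup:

```python
import csv
import io

# Sample line in the same column order as the csv filter above.
line = '"Jun 27, 2016, 10:56:17 PM",107.77.212.114,10.119.1.29,HTTP 200 - Ok,CTSUSCHDSXCM003,1'
columns = ["Start Time", "Source IP", "Destination IP",
           "Event Name", "Log Source", "Event Count"]

# csv handles the quoted "Start Time" field, which itself contains commas.
row = next(csv.reader(io.StringIO(line)))
event = dict(zip(columns, row))

# The two add_field lines append longitude then latitude, yielding a
# [lon, lat] array -- the order Elasticsearch expects for a geo_point
# given as an array. Placeholder values, not a real lookup:
longitude, latitude = -97.822, 37.751
event["geoip"] = {"coordinates": [longitude, latitude]}

print(event["Source IP"])            # 107.77.212.114
print(event["geoip"]["coordinates"])
```

Note that the [lon, lat] ordering matters: if the index mapping treats geoip.coordinates as a geo_point, swapping the two values silently plots points in the wrong place.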
Upvotes: 0
Views: 965
Reputation: 59
Baudsp, Will, your responses are correct. The database was the issue; I fixed it by downloading a new one. Thanks for your help.
Upvotes: 0
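Since the fix here was a bad database file, a quick sanity check before starting Logstash is to confirm the .dat file the geoip filter points at exists and is non-empty. A small sketch (the paths passed to the function are just examples):

```shell
# Check that a GeoIP database file exists and is non-empty before
# pointing the geoip filter's `database` option at it.
check_db() {
  if [ -s "$1" ]; then
    echo "ok: $1"
  else
    echo "missing or empty: $1"
  fi
}

# Demonstrate on a path that does not exist and on a real temp file.
check_db /nonexistent/GeoLiteCity.dat
tmp=$(mktemp)
echo "stub data" > "$tmp"
check_db "$tmp"
rm -f "$tmp"
```

A missing or truncated database typically shows up as geoip filter errors in the Logstash log rather than an obvious startup failure, so checking the file up front saves a debugging round-trip.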