Swaraj

Reputation: 109

Not able to map CSV file from Logstash to Kibana on Windows

I'm trying to feed data from CSV files into Elasticsearch using Logstash. My Logstash config file looks like this:

input {
  file {
    path => "D:\Log Anlyser\data\cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cars-%{+YYYY.MM.dd}"
  }
}

When I run this command for Logstash on Windows: logstash -f cars.conf, I get the following:

Sending Logstash logs to D:/Log_Anlyser/logstash/logs which is now configured via log4j2.properties
[2019-02-26T12:05:51,690][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-02-26T12:05:51,721][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.6.1"}
[2019-02-26T12:05:57,133][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-02-26T12:05:57,510][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-02-26T12:05:57,664][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-02-26T12:05:57,711][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>5}
[2019-02-26T12:05:57,742][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-02-26T12:05:57,758][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-02-26T12:05:57,852][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-02-26T12:05:58,179][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x274079d5 run>"}
[2019-02-26T12:05:58,226][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-02-26T12:05:58,226][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-02-26T12:05:58,547][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Now, when connecting to Kibana (localhost:5601), I am not able to map the data. I am getting this error:

Unable to fetch mapping. Do you have indices matching the pattern?

Can you please help?


Upvotes: 1

Views: 643

Answers (2)

Sanjiv

Reputation: 1298

I found the problem, and the mistake was very silly: the path of the CSV file was wrong. Earlier the path was path => "D:\Log Anlyser\data\cars.csv". The correct path is:

path => "D:/Log_Anlyser/data/cars.csv"

With the corrected path it works.
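
For reference, a minimal sketch of the corrected input block. The forward-slash path is taken from this answer; note that the underscore in Log_Anlyser also matches the log line "Sending Logstash logs to D:/Log_Anlyser/logstash/logs" in the question, so the original path had a typo in the directory name as well as the wrong slashes:

input {
  file {
    # Forward slashes avoid backslash-escaping issues in Logstash paths on Windows.
    path => "D:/Log_Anlyser/data/cars.csv"
    start_position => "beginning"
    # "NUL" is the Windows null device: no sincedb state is persisted,
    # so the file is re-read from the beginning on every run.
    sincedb_path => "NUL"
  }
}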

Upvotes: 3

There might be a few reasons. Maybe the data is not reaching Elasticsearch at all; you can check whether the index exists by running:

GET es-url:9200/_cat/indices/cars*

If the index exists, you should be able to create the index pattern in Kibana.

If the index is missing, then either Logstash is not reading the input file or Elasticsearch is not reachable. Check the Logstash logs and confirm that the data reaches Elasticsearch.
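
As a quick sketch, assuming Elasticsearch is on localhost:9200 as in the question, the same checks can be run with curl:

# List any indices matching the cars pattern (?v adds column headers).
curl -s "http://localhost:9200/_cat/indices/cars*?v"

# If an index shows up, verify that documents actually arrived.
curl -s "http://localhost:9200/cars-*/_count"

If _cat/indices returns nothing, the problem is on the Logstash side; if it lists a cars-* index with a non-zero document count, the Kibana index pattern cars-* should match it.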

Upvotes: 0
