Reputation: 677
I'm trying to import a CSV file to create data on my Elasticsearch server in order to test it, but I'm stuck importing the data using a config file.
This is the command (on Windows): logstash -f file.config
This is my config file:
input {
  file {
    path => "/E:/Formation/kibana/data/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["maker","model","mileage","manufacture_year","engine_displacement",
                "engine_power","body_type","color_slug","stk_year","transimission","door_count",
                "seat_count","fuel_type","date_created","date_last_seen","price_eur"]
  }
  mutate {
    convert => ["mileage","integer"]
    convert => ["price_eur","float"]
    convert => ["engine_power","integer"]
    convert => ["door_count","integer"]
    convert => ["seat_count","integer"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout { }
}
And this is the error. UPDATE: this is the log after running with --debug. Thanks for helping.
16:49:29.252 [Ruby-0-Thread-11: E:/Formation/kibana/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:532] DEBUG logstash.pipeline - Pushing flush onto pipeline
16:49:34.257 [Ruby-0-Thread-11: E:/Formation/kibana/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:532] DEBUG logstash.pipeline - Pushing flush onto pipeline
16:49:39.257 [Ruby-0-Thread-11: E:/Formation/kibana/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:532] DEBUG logstash.pipeline - Pushing flush onto pipeline
16:49:43.663 [[main]<file] DEBUG logstash.inputs.file - _globbed_files: /e/Formation/kibana/data/cars.csv: glob is: []
Upvotes: 0
Views: 1492
Reputation: 43
On Windows, you should use sincedb_path => "nul" instead of sincedb_path => "/dev/null", which is used on Linux-based operating systems.
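For example, a minimal sketch of the file input from the question with only the sincedb_path changed for Windows (the path is taken as-is from your config; the rest of the pipeline stays the same):

input {
  file {
    path => "/E:/Formation/kibana/data/cars.csv"
    start_position => "beginning"
    # "nul" is the Windows null device, the equivalent of /dev/null on Linux
    sincedb_path => "nul"
  }
}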
Upvotes: 1