Reputation: 39
I am trying to send data to ES from a MySQL server using logstash. I am using ES 6.3.0 and Logstash 6.3.0. My config file looks as follows:
input {
  jdbc {
    jdbc_driver_library => "/Users/.../mysql-connector-java-5.1.46-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://..."
    jdbc_user => "user"
    jdbc_password => "****"
    statement => "SELECT * FROM user.customer"
  }
}
output {
  #stdout { codec => json_lines }
  elasticsearch {
    hosts => "localhost"
    index => "customers"
  }
}
I am using a MySQL database with one table which has several columns of different data types. When I try to send data to ES I get the following error message:
[2018-07-03T14:39:06,088][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"customers", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2ff3608c>], :response=>{"index"=>{"_index"=>"customers", "_type"=>"doc", "_id"=>"p-MnYGQBzIWWUpovTpES", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [uc_score] cannot be changed from type [long] to [float]"}}}}
After doing some research I found that a field in Elasticsearch can apparently no longer hold values of different types, but I am not sure exactly what that means. How can I get past this problem? What is a smart way of sending data from a MySQL database to ES? I am planning to use ES to analyze and visualize the data in the database.
Upvotes: 1
Views: 508
Reputation: 56
This is due to the limited ability of the jdbc input plugin to convert types. The field `uc_score` is read from MySQL as `long`, and it is not automatically converted by JDBC into the Elasticsearch type `float`, so its mapping conflicts with the existing one. You should add the following fragment to the `filter` section of your Logstash pipeline configuration to convert the `uc_score` field:
filter {
  mutate { convert => {"uc_score" => "float"} }
}
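For reference, the complete pipeline with the filter in place would look roughly like this (connection details elided exactly as in the question):

```
input {
  jdbc {
    jdbc_driver_library => "/Users/.../mysql-connector-java-5.1.46-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://..."
    jdbc_user => "user"
    jdbc_password => "****"
    statement => "SELECT * FROM user.customer"
  }
}

filter {
  # Force uc_score to a float so every document matches the index mapping
  mutate { convert => {"uc_score" => "float"} }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "customers"
  }
}
```

The `filter` block sits between `input` and `output`; events pass through it before being indexed.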
Do not forget to restart your logstash instance!
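Also note that if the `customers` index already exists with the conflicting `long` mapping for `uc_score`, Elasticsearch will keep rejecting the converted documents, because an existing field mapping cannot be changed in place. Assuming the data already in that index can be discarded, one option is to delete the index so it is re-created with a `float` mapping on the next Logstash run:

```shell
# Delete the existing index (this removes its data and mappings).
# Assumes Elasticsearch is reachable on localhost:9200, as in the question.
curl -X DELETE "localhost:9200/customers"
```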
Source: Importing Data from MySQL to Elasticsearch to Visualize it with Kibana
Upvotes: 1