Reputation: 2028
I was trying to stream my MySQL database changes to a Kafka topic via a source connector, and that works. Now I want to send that data to an Elasticsearch instance.
To do that, I was following these two guides: "Kafka Connect Elasticsearch: Consuming and Indexing with Kafka Connect" and "Kafka Connect and Elasticsearch".
For CDC from MySQL to Kafka, I can see the changes I made in MySQL by creating a source connector and reading the topic. But when I create another connector, the elasticsearch-sink connector, the source connector's task.state shows FAILED! And hence the database changes are not going into ES, though the index is created there as set up in the es-config.properties file.
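For context, a minimal sketch of what a MySQL CDC source-connector config can look like, assuming the Debezium MySQL connector is the one in use (the hostname, credentials, and names below are hypothetical placeholders, not the question's actual setup):

```properties
# Hypothetical Debezium MySQL source connector config (e.g. mysql-source.properties)
name=mysql-source-connector
connector.class=io.debezium.connector.mysql.MySqlConnector
tasks.max=1
database.hostname=localhost
database.port=3306
database.user=debezium
database.password=secret
database.server.id=1
# Logical server name; used as the prefix of the change-event topics
database.server.name=mysql-server
# Debezium stores schema history in its own Kafka topic
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=schema-changes.mysql
```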
I have put the jar(s) inside the Kafka directory where the source connector's libs worked (to avoid any further classpath problems).
When creating the elasticsearch-sink connector, I get this error (though for the source connector I have no error, and all libs are in the same directory!):
ERROR Plugin class loader for connector: 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' was not found. Returning: org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader@5cc126dc (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:165)
I am running my connector like this:
bin/connect-standalone.sh config/connect-standalone.properties config/elasticsearch-connect.properties
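For comparison, a minimal sketch of what config/elasticsearch-connect.properties might contain for the Confluent Elasticsearch sink (the topic name and connection URL here are assumed placeholders):

```properties
# Hypothetical Elasticsearch sink connector config (elasticsearch-connect.properties)
name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
# Topic(s) written by the MySQL source connector
topics=mysql-server.mydb.mytable
connection.url=http://localhost:9200
type.name=_doc
# Ignore record keys and schemas when the payload is plain JSON
key.ignore=true
schema.ignore=true
```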
In brief, only one of my connectors' task.state remains RUNNING at a time.
Edit: plugin.path for the connect-standalone.properties file:
plugin.path=/media/***/projects/playground/kafka/kafka_2.12-2.4.0, /media/***/projects/playground/kafka/kafka-connect-elasticsearch/target/kafka-connect-elasticsearch-3.2.0-SNAPSHOT-package/share/java
Both of them contain the es-connector jar; the last one was added later, but the result is still the same.
What should I do now?
Upvotes: 0
Views: 1070
Reputation: 2028
The thing worked like a charm when I just changed schemas.enable to false:
key.converter.schemas.enable=false
value.converter.schemas.enable=false
and added an extra / after each plugin.path entry, though without the / it worked for the source connector!
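Concretely, the working plugin.path line in connect-standalone.properties ends each entry with a trailing slash (using the same anonymized paths as in the question):

```properties
plugin.path=/media/***/projects/playground/kafka/kafka_2.12-2.4.0/,/media/***/projects/playground/kafka/kafka-connect-elasticsearch/target/kafka-connect-elasticsearch-3.2.0-SNAPSHOT-package/share/java/
```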
Edit: I forgot to mention that I also replaced my connector version with the 5.4.0 release, as cricket_007 mentioned.
Edit-2: I later investigated more and found that the extra / fix, along with the new key properties below, helped me get rid of the FAILED state of the connectors (only one connector was RUNNING at a time):
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
in the connect-standalone.properties file.
Thanks
Upvotes: 2