Reputation: 431
I'm following this tutorial: https://docs.confluent.io/4.0.0/installation/docker/docs/tutorials/connect-avro-jdbc.html
and put my code here: https://github.com/lexsteens/kafka-connect-tutorial
I have a problem when registering the JDBC connector: it gets registered, but when I check its status at http://localhost:28083/connectors/quickstart-jdbc-source/status, I get a FAILED status and the following stack trace:
org.apache.kafka.connect.errors.DataException: quickstart-jdbc-test
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:77)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:220)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:187)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
at sun.security.ssl.InputRecord.handleUnknownRecord(InputRecord.java:710)
at sun.security.ssl.InputRecord.read(InputRecord.java:527)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:973)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1283)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1258)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:250)
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:161)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:218)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:307)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:299)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:294)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:61)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:100)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:79)
at io.confluent.connect.avro.AvroConverter.serialize(AvroConverter.java:110)
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:75)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:220)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:187)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
However, I didn't request SSL encryption anywhere (see my docker-compose...).
I tried googling this issue and searching for it here, but had no success.
Upvotes: 2
Views: 2592
Reputation: 191874
For starters, remove localhost entirely from your Compose file; I'm surprised the services start correctly at all. Review Networking in Compose and note that setting the hostname property is not required: services can reach each other using just the service name. For example, Kafka would connect to zookeeper:32181, not localhost, and your connector JSON should use the mysql:3306 address, not localhost:3306, etc.
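As a minimal sketch of what I mean (the image tags, ports, and most settings here are assumptions based on the tutorial, not taken from your repo), the services just reference each other by name:

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:4.1.2
    environment:
      ZOOKEEPER_CLIENT_PORT: 32181
  kafka:
    image: confluentinc/cp-kafka:4.1.2
    environment:
      # reach ZooKeeper via its service name, not localhost
      # (other required Kafka settings omitted for brevity)
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:32181
  mysql:
    image: mysql:5.7

Correspondingly, the JDBC connector's connection.url would look something like jdbc:mysql://mysql:3306/your_database (database name is just a placeholder), rather than pointing at localhost.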
You can look at my Compose file for reference; it is partially taken from Confluent's Docker Compose.
While I've only tested with the S3 connector so far, your problem is coming from the AvroConverter, which is shared across all those projects, and I can verify that it did work for writing Avro messages to S3.
Specifically regarding the errors: they are related to the Schema Registry. Your Connect service is configured with an HTTPS address for it, but it doesn't look like you set up SSL for that container.
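In other words, the Connect container's converter settings should point at the registry over plain HTTP via its service name. Again just a sketch; the schema-registry service name, port 8081, and image tag are assumptions:

  connect:
    image: confluentinc/cp-kafka-connect:4.1.2
    environment:
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      # plain http, not https, since no SSL was configured for the registry;
      # and the service name, not localhost
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081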
By the way, my suggestion would be to use at least the 4.1.2 images rather than 4.0.0 to get the latest patches. If not those, the latest is currently 5.0.0.
Upvotes: 2