Reputation: 3739
I am using Spring Cloud Data Flow version 1.2.2 with the following configuration:
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.type=kafka
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.brokers=<MY_BROKER>
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.zkNodes=<MY_ZK>
I'm trying to create a stream that will read from a specific topic and flush it into the log sink, as follows:
stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' > bridge | log" --deploy
Looking at the log file, I can see the following error:
2017-07-17 09:44:01,700 INFO -kafka-listener-1 log-sink:202 - [B@79d0a6b6
2017-07-17 09:44:01,700 ERROR -kafka-listener-1 o.s.c.s.b.k.KafkaMessageChannelBinder:283 - Could not convert message: 7B226D657472696354696D657374616D70223A313530303233373037302C226D65747269634E616D65223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F313136222C224074696D657374616D70223A22323031372D30372D31365432303A33313A32352E3438325A222C22706F7274223A33363133302C226D657472696356616C7565223A302C224076657273696F6E223A2231222C22686F7374223A223137322E32362E312E313135222C226D657373616765223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F31313620302031353030323337303730227D
java.lang.StringIndexOutOfBoundsException: String index out of range: 380
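For reference, the hex payload in that error decodes to the expected JSON record:

{"metricTimestamp":1500237070,"metricName":"cewc.reconnaissance_and_scanning.drops_per_external_source_ip.113_29_236_116","@timestamp":"2017-07-16T20:31:25.482Z","port":36130,"metricValue":0,"@version":"1","host":"172.26.1.115","message":"cewc.reconnaissance_and_scanning.drops_per_external_source_ip.113_29_236_116 0 1500237070"}

So the message on the topic is valid JSON; the failure happens while the binder tries to convert it.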
I also tried configuring some consumer/producer properties for the Kafka source:
stream create --name metricsStream --definition ":metrics --spring.kafka.consumer.valueDerserializer=org.apache.kafka.common.serialization.StringDeserializer --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' --spring.cloud.stream.bindings.input.consumer.headerMode=raw --spring.cloud.stream.bindings.output.producer.headerMode=raw --outputType='text/plain;charset=UTF-8' > bridge | log" --deploy
But I got the same result.
Here are the consumer details, as printed by Spring Cloud Data Flow:
2017-07-17 09:43:57,267 INFO main o.a.k.c.c.ConsumerConfig:180 - ConsumerConfig values:
    auto.commit.interval.ms = 100
    auto.offset.reset = earliest
    bootstrap.servers = [172.26.1.63:9092]
    check.crcs = true
    client.id = consumer-2
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = metrics_KafkaToHdfs_5
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
I saw a similar question, but with no valid answer: "what is the property to accept binary json message in spring-cloud-stream kafka binder".
My Kafka metrics topic contains JSON lines. How should I configure Spring Cloud Data Flow so it can read from the Kafka topic as JSON (or at least as a String that looks like JSON)?
Upvotes: 0
Views: 1340
Reputation: 24472
Have you tried configuring the input content-type?
spring.cloud.stream.bindings.input.content-type=application/json
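For example, applied to your stream definition (a sketch based on your metricsStream command, keeping your kafka1 binder setting):

stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.input.content-type=application/json > bridge | log" --deploy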
Or maybe with the Spring Cloud Data Flow prefix:
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.bindings.input.content-type=application/json
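Note the difference in scope: the binding property on its own applies to one app's input, while the spring.cloud.dataflow.applicationProperties.stream prefix makes Data Flow pass the setting to every stream app it deploys. If you instead want to scope it to a single app at deployment time, the app.&lt;app-name&gt; deployment-property prefix should also work; a sketch, assuming the log sink is the app that needs it:

stream deploy --name metricsStream --properties "app.log.spring.cloud.stream.bindings.input.content-type=application/json"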
Upvotes: 3