何子洋

Reputation: 57

How to use named destination for custom processor in spring cloud data flow?

I am learning SCDF these days and have some questions about named destinations. I created a stream like ":test-topic > log", and I can see the log sink consume data from the Kafka topic "test-topic". But then I registered a custom processor in SCDF and created a stream like this:

:test-topic-source > etl-data-transform > :test-topic-sink

I expected the "etl-data-transform" processor to consume data from the Kafka topic "test-topic-source" and produce data to "test-topic-sink", but the log shows "Subscribed to topic(s): stringOperation-in-0" and "Using kafka topic for outbound: stringOperation-out-0" ("stringOperation" is my custom function).
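For context, in Spring Cloud Stream's functional model the processor is just a `Function` bean, and binding names like `stringOperation-in-0` / `stringOperation-out-0` are derived from the bean's name. A minimal plain-Java sketch of such a function (the transformation body here is a placeholder assumption, not the asker's actual logic):

```java
import java.util.function.Function;

public class StringOperationDemo {

    // In the real app this would be a @Bean named "stringOperation";
    // Spring Cloud Stream derives the binding names "stringOperation-in-0"
    // and "stringOperation-out-0" from that bean name.
    static Function<String, String> stringOperation() {
        return payload -> payload.toUpperCase(); // placeholder transformation
    }

    public static void main(String[] args) {
        System.out.println(stringOperation().apply("hello"));
    }
}
```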

Why does SCDF not use the topics "test-topic-source" and "test-topic-sink"? How can I solve this?


I know I can use properties like this:

spring.cloud.stream.function.bindings.stringOperation-in-0=in
spring.cloud.stream.bindings.in.destination=test-topic-source

But what if I want to output to two topics?
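For completeness, a sketch of the full mapping for both bindings (assuming the function bean is named stringOperation, as in the logs above):

```properties
# Map the generated functional binding names to friendlier ones,
# then point each binding at the desired Kafka topic.
spring.cloud.stream.function.bindings.stringOperation-in-0=input
spring.cloud.stream.bindings.input.destination=test-topic-source
spring.cloud.stream.function.bindings.stringOperation-out-0=output
spring.cloud.stream.bindings.output.destination=test-topic-sink
```

As for producing to more than one topic: a single `Function` has exactly one output binding, so Spring Cloud Stream 3.x typically handles this with `StreamBridge` (e.g. `streamBridge.send("other-topic", payload)`) rather than additional destination properties.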

Thanks!

Upvotes: 0

Views: 242

Answers (2)

何子洋

Reputation: 57

Thanks! That was my mistake. I added '@EnableBinding(Processor.class)' to my custom processor, and that fixed it.

Upvotes: 0

Ilayaperumal Gopinathan

Reputation: 4179

It looks like your custom processor application uses in and out as its inbound and outbound binding names. SCDF expects these names to be input and output, respectively, which is why you see explicit binding names derived from in and out. Please change the names to input and output; I believe that should fix the issue.
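In the functional model, this rename can be expressed with binding-name properties rather than code changes (a sketch, using the stringOperation function name from the question):

```properties
# Rename the generated functional bindings to the names SCDF expects.
spring.cloud.stream.function.bindings.stringOperation-in-0=input
spring.cloud.stream.function.bindings.stringOperation-out-0=output
```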

Upvotes: 1
