Derlin

Reputation: 9891

kafka streams - rich mappers

question: I have records in a Kafka topic that need to be augmented with metadata from a MySQL database (among other sources). With Flink, it is possible to implement rich mappers, so that a connection can be opened once and reused across multiple records.

Is there similar functionality in Kafka Streams (Java)?


random thoughts: so far, I have found the following variants:

Is there something I missed?

Note: I also thought about Kafka Connect, but I need to transform data between two Kafka topics, not between Kafka and an external system...

Upvotes: 2

Views: 285

Answers (2)

miguno

Reputation: 15057

Exactly what Matthias J. Sax said in his answer: Processor and Transformer can be stateless or stateful.

For reference, let me also point you to the following snippet from Confluent's documentation of the Kafka Streams API (intro at http://docs.confluent.io/3.2.0/streams/developer-guide.html#processor-api):

The Processor API can be used to implement both stateless as well as stateful operations, where the latter is achieved through the use of state stores.

There's also a demo application that implements a stateless Transformer: https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/test/java/io/confluent/examples/streams/MixAndMatchLambdaIntegrationTest.java

The example above (branch 3.2.x of confluentinc/examples) is for Confluent 3.2.0 with Apache Kafka 0.10.2.0.
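To tie this back to the MySQL use case in the question, here is a minimal, hypothetical sketch of such a Transformer against the 0.10.2-era API: the connection is opened once in init() and reused for every record, much like a Flink rich mapper. The class name, JDBC URL, credentials, and table/column names are placeholders, not taken from the linked demo.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.Transformer;
import org.apache.kafka.streams.processor.ProcessorContext;

// Hypothetical transformer: the MySQL connection is opened once in init()
// and reused for every record, similar to a Flink rich mapper.
public class MetadataEnricher implements Transformer<String, String, KeyValue<String, String>> {

    private Connection connection;
    private PreparedStatement lookup;

    @Override
    public void init(final ProcessorContext context) {
        try {
            // JDBC URL, credentials, table and column names are placeholders
            connection = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/metadata", "user", "password");
            lookup = connection.prepareStatement("SELECT info FROM lookup_table WHERE id = ?");
        } catch (final SQLException e) {
            throw new RuntimeException("Could not open MySQL connection", e);
        }
    }

    @Override
    public KeyValue<String, String> transform(final String key, final String value) {
        try {
            lookup.setString(1, key);
            try (ResultSet rs = lookup.executeQuery()) {
                final String info = rs.next() ? rs.getString("info") : "unknown";
                // Forward the original value, enriched with the looked-up metadata.
                return KeyValue.pair(key, value + "|" + info);
            }
        } catch (final SQLException e) {
            throw new RuntimeException("Metadata lookup failed for key " + key, e);
        }
    }

    // Required by the 0.10.2-era Transformer interface; unused here.
    @Override
    public KeyValue<String, String> punctuate(final long timestamp) {
        return null;
    }

    @Override
    public void close() {
        try {
            if (lookup != null) lookup.close();
            if (connection != null) connection.close();
        } catch (final SQLException e) {
            // best effort on shutdown
        }
    }
}
```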

Upvotes: 1

Matthias J. Sax

Reputation: 62310

You can also use transform()/process() without a state -- the state is optional. Thus, this should give you what you need.
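For completeness, here is a hedged sketch of how such a transformer could be wired into a topology via transform() with no state store, using the 0.10.2-era DSL. The application id, topic names, and the MetadataEnricher class (sketched in the answer above) are assumptions for illustration only.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.TransformerSupplier;

public class EnrichmentApp {

    public static void main(final String[] args) {
        final Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "metadata-enrichment");  // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        final KStreamBuilder builder = new KStreamBuilder();
        final KStream<String, String> input = builder.stream("input-topic");

        // A new transformer instance (and hence a new DB connection) is created per stream task.
        final TransformerSupplier<String, String, KeyValue<String, String>> enricher =
            MetadataEnricher::new;

        // No state store names are passed, so the transformer runs stateless.
        input.transform(enricher)
             .to("enriched-topic");

        new KafkaStreams(builder, props).start();
    }
}
```

Because no state store names are passed to transform(), the operation stays stateless; the database connection simply lives inside the transformer instance for the lifetime of the task.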

Upvotes: 0
