Reputation: 719
What is considered best practice when creating topics for Apache Kafka?
Does everyone allow automatic creation of topics, or how do you do it? Do you bundle the topic-creation step with the startup of the Kafka instance?
I have a Docker-based Kafka installation which is going to be used by multiple applications. How can I keep the topic creation for every application separate from the startup of the Kafka container? Looking at Confluent's music demo, they create the topics by spinning up a new Kafka image, calling the "create-topic" script, and then leaving the container to die. This feels a bit "hacky", but maybe it's the only way?
Regards
Upvotes: 22
Views: 37322
Reputation: 38193
What is considered best practice when creating topics for Apache Kafka? Does everyone allow automatic creation of topics, or how do you do it?
It depends on what you're doing. You can definitely use topic auto-creation, but then the automatically created topics will have the default broker-wide configuration in terms of partitions and replication factor.
For Kafka Streams, a Confluent engineer writes that manually creating topics before starting the application is recommended:
I also want to point out, that it is highly recommended to not use auto topic create for Streams, but to manually create all input/output topics before you start your Streams application.
For more details, see http://docs.confluent.io/current/streams/developer-guide.html#managing-topics-of-a-kafka-streams-application
With regard to:
Do you bundle the topic-creation step with the startup of the Kafka instance?
Yes. If you have a Java application, you can use AdminClient in the main method of your application, before you start the application proper. If you have some other kind of application, you can run an init script that calls bin/kafka-topics.sh before your application starts. If you're using Kubernetes, you can use a Kubernetes Init Container. But there are obviously any number of ways you can do this.
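As a concrete sketch of the init-script approach: the script below assumes kafka-topics.sh is on the PATH and that the broker is reachable at kafka:9092 (both are assumptions about your setup, as are the topic names and sizes):

```
#!/usr/bin/env bash
set -euo pipefail

BROKER="${KAFKA_BROKER:-kafka:9092}"   # assumed broker address; override via env

# Wait until the broker responds before trying to create topics.
until kafka-topics.sh --bootstrap-server "$BROKER" --list >/dev/null 2>&1; do
  echo "Waiting for Kafka at $BROKER ..."
  sleep 2
done

# --if-not-exists makes the script safe to re-run on every application start.
kafka-topics.sh --bootstrap-server "$BROKER" --create --if-not-exists \
  --topic my-app-input --partitions 6 --replication-factor 3
kafka-topics.sh --bootstrap-server "$BROKER" --create --if-not-exists \
  --topic my-app-output --partitions 6 --replication-factor 3
```

The same script can run as a Kubernetes Init Container, or as a short-lived sidecar container in a Docker Compose setup, before the main application container starts.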
This feels a bit "hacky", but maybe it's the only way?
I don't think this is hacky. It's pretty normal to have init steps, I think.
Finally, also note that you may want to configure the retention policy on your topics. This can be done with broker-wide defaults or on a per-topic basis: https://stackoverflow.com/a/48504305/741970.
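For instance, retention can be set per topic with kafka-configs.sh (the topic name and retention value here are only illustrative):

```
# Keep messages on my-topic for 7 days (retention.ms is a per-topic config)
bin/kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name my-topic \
  --add-config retention.ms=604800000
```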
Thanks to Peter S. for pointing out that the officially recommended way to create topics is in the CI pipeline:
The recommended approach is to create topics through a continuous integration pipeline, where topics are defined in source control and created through a build process. This ensures scripts can validate that all topic names conform to the desired conventions before getting created. A helpful tool to manage topics within a Kafka cluster is kafka-dsf.
Upvotes: 23
Reputation: 42240
Another option to manage topics is to take a declarative, git-ops approach. This separates topic management from the runtime of applications completely. Whether this constitutes "best practice," or not, I think depends on the situation. For some use-cases/teams this can work very well.
See the following question/answer for tooling supporting this approach.
How to declaratively manage Kafka topics?
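As one concrete illustration of the declarative approach: if the cluster runs on Kubernetes with Strimzi, each topic can be described as a KafkaTopic custom resource kept in source control and applied by your git-ops tooling (the cluster name, topic name, and sizes below are illustrative):

```
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: my-topic
  labels:
    strimzi.io/cluster: my-cluster   # which Kafka cluster this topic belongs to
spec:
  partitions: 3
  replicas: 3
  config:
    retention.ms: 604800000          # per-topic retention, 7 days
```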
Upvotes: 2
Reputation: 1196
You have two ways to create a Kafka topic; which one to use depends on your needs:

1. Set auto.create.topics.enable to true (it is by default), and the topic will be created automatically when a value is first published to the broker. Be sure to also check the following broker properties: default.replication.factor for the default number of replicas of the created topic, and num.partitions for the default number of partitions.

2. Manually create the topic with the official script:

bin/kafka-topics.sh --create --zookeeper <your_zookeeper_host>:2181 --replication-factor <number_of_replicas> --partitions <number_of_partitions> --topic <name_of_your_topic>

(On Kafka 2.2 and later, use --bootstrap-server <broker_host>:9092 instead of the deprecated --zookeeper flag.)
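For option 1, the relevant broker-wide defaults live in the broker's server.properties; the values below are only illustrative:

```
# server.properties (defaults applied to auto-created topics)
auto.create.topics.enable=true
default.replication.factor=3
num.partitions=6
```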
Upvotes: 2