Siddharth Sharma

Reputation: 11

How to increase default max memory of message in Confluent Cloud?

I am new to Kafka. I have created a cluster and a topic in Confluent Cloud (Basic plan). The default maximum message size is 8 MB according to the Confluent documentation (https://docs.confluent.io/cloud/current/clusters/broker-config.html#custom-topic-settings-for-all-cluster-types). I want to store big files in Kafka, i.e. ZIP files (300 MB+). How can I configure this in Confluent Cloud if I want to store large files in Kafka?

Question (out of curiosity): Can we even store large ZIP files in Kafka?

I tried updating the values in the topic configuration settings (https://prnt.sc/b8Hy8EZ-kagB), but it seems there is no way to set the limit higher than 8 MB.
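For reference, the equivalent change made programmatically would look roughly like the minimal sketch below (the topic name and credentials are placeholders); on a Basic cluster the broker is expected to reject any value above the documented 8 MB ceiling.

```python
# Minimal sketch: attempt to raise max.message.bytes on a Confluent Cloud topic.
# Topic name and credentials are placeholders; values above the documented
# 8 MB ceiling should be rejected on a Basic cluster.
from confluent_kafka.admin import AdminClient, ConfigResource

admin = AdminClient({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# Note: alter_configs sets the full config; unspecified per-topic overrides
# are reverted to their defaults.
resource = ConfigResource("topic", "my-zip-files")
resource.set_config("max.message.bytes", "314572800")  # ~300 MB, above the cap

futures = admin.alter_configs([resource])
for res, future in futures.items():
    try:
        future.result()  # raises if the broker rejects the change
        print(f"{res} updated")
    except Exception as exc:
        print(f"{res} update failed: {exc}")
```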

Upvotes: 1

Views: 459

Answers (1)

Sivaram Rasathurai

Reputation: 6343

Storing large files in Kafka is generally discouraged: Kafka is optimized for handling large volumes of small messages, not a limited number of very large ones.

It is advisable to use a specialized file storage system, such as Amazon S3 or Google Cloud Storage, to store and serve large files instead of Kafka. These systems are designed to handle large files effectively and can be connected to Kafka to trigger events or notifications when files are uploaded or updated.

The better approach is to store the data in cloud storage and pass only the metadata in Kafka messages. Here is an example from Conduktor: https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/
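A minimal sketch of that pattern, assuming an S3 bucket and a small Kafka topic for the metadata (the bucket, topic, and credential names below are placeholders): the ZIP goes to object storage, and only a short JSON reference is produced to Kafka.

```python
# Minimal sketch of the claim-check pattern: store the large ZIP in S3 and
# publish only a small metadata record to Kafka. Bucket, topic, and credential
# values are placeholders.
import json
import boto3
from confluent_kafka import Producer

BUCKET = "my-large-files"   # hypothetical S3 bucket
TOPIC = "file-events"       # hypothetical Kafka topic for file metadata

def publish_large_file(local_path: str, object_key: str) -> None:
    # 1. Upload the big file to object storage.
    s3 = boto3.client("s3")
    s3.upload_file(local_path, BUCKET, object_key)

    # 2. Produce a small reference message (well under the 8 MB limit) to Kafka.
    producer = Producer({
        "bootstrap.servers": "<BOOTSTRAP_SERVER>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API_KEY>",
        "sasl.password": "<API_SECRET>",
    })
    event = {"bucket": BUCKET, "key": object_key, "source_path": local_path}
    producer.produce(TOPIC, key=object_key, value=json.dumps(event))
    producer.flush()

publish_large_file("report.zip", "uploads/report.zip")
```

Consumers then read the metadata record and fetch the file from S3 themselves, so Kafka only ever carries a few hundred bytes per file.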

Upvotes: 1
