Reputation: 5177
The code for my Kafka consumer looks like this:
def read_messages_from_kafka():
    topic = 'my-topic'
    consumer = KafkaConsumer(
        bootstrap_servers=['my-host1', 'my-host2'],
        client_id='my-client',
        group_id='my-group',
        auto_offset_reset='earliest',
        enable_auto_commit=False,
        api_version=(0, 8, 2)
    )
    consumer.assign([TopicPartition(topic, 0), TopicPartition(topic, 1)])
    messages = consumer.poll(timeout_ms=kafka_config.poll_timeout_ms, max_records=kafka_config.poll_max_records)
    for partition in messages.values():
        for message in partition:
            log.info("read {}".format(message))
    if messages:
        consumer.commit()
    next_offset0, next_offset1 = consumer.position(TopicPartition(topic, 0)), consumer.position(TopicPartition(topic, 1))
    log.info("next offset0={} and offset1={}".format(next_offset0, next_offset1))

while True:
    read_messages_from_kafka()
    sleep(kafka_config.poll_sleep_ms / 1000.0)
I have realised that this consumer setup is not able to read all the messages, and I cannot reproduce the problem because it is intermittent. When I compare the last 100 messages fetched by kafkacat against what this consumer reads, I find that my consumer intermittently misses a few messages at random. What's wrong with my consumer?
kafkacat -C -b my-host1 -X broker.version.fallback=0.8.2.1 -t my-topic -o -100
There are just too many ways to consume messages in Python. There should be one, and preferably only one, obvious way to do it.
Upvotes: 1
Views: 4185
Reputation: 11208
There is a problem with missing messages in your Kafka client. I found a solution that looks like this:
while True:
    raw_messages = consumer.poll(timeout_ms=1000, max_records=5000)
    for topic_partition, messages in raw_messages.items():
        for message in messages:
            application_message = json.loads(message.value.decode())
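The key point is that poll() returns a dict mapping each TopicPartition to a list of records, so every list must be iterated in full or records get silently skipped. Here is a minimal, broker-free sketch of that flattening step (FakeRecord and the sample payloads are invented for illustration):

```python
import json
from collections import namedtuple

# Stand-in for a Kafka record; only the .value attribute is used here.
FakeRecord = namedtuple('FakeRecord', ['value'])

def decode_poll_result(raw_messages):
    """Flatten a poll() result dict into a list of decoded JSON payloads."""
    decoded = []
    for topic_partition, records in raw_messages.items():
        for record in records:  # iterate every record, not just one
            decoded.append(json.loads(record.value.decode()))
    return decoded

# Simulated poll() output covering two partitions:
raw = {
    'partition-0': [FakeRecord(b'{"id": 1}'), FakeRecord(b'{"id": 2}')],
    'partition-1': [FakeRecord(b'{"id": 3}')],
}
print(decode_poll_result(raw))  # [{'id': 1}, {'id': 2}, {'id': 3}]
```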
There is also another Kafka client, confluent_kafka, which does not have this problem.
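For reference, a confluent_kafka consumer might look like the sketch below. The broker addresses, topic, and group id mirror the question and are assumptions; the poll loop is wrapped in a function because running it needs the confluent-kafka package and a live broker:

```python
import json

def build_config(brokers, group_id):
    # Config keys follow confluent_kafka's librdkafka-style naming.
    return {
        'bootstrap.servers': ','.join(brokers),
        'group.id': group_id,
        'auto.offset.reset': 'earliest',
        'enable.auto.commit': False,
    }

def consume_forever(config, topic):
    # Not executed here: requires the confluent-kafka package and a broker.
    from confluent_kafka import Consumer
    consumer = Consumer(config)
    consumer.subscribe([topic])
    try:
        while True:
            msg = consumer.poll(timeout=1.0)  # one message per call, or None
            if msg is None:
                continue
            if msg.error():
                continue  # log the error in real code
            data = json.loads(msg.value().decode())  # process data here
            consumer.commit(msg)  # commit only after processing
    finally:
        consumer.close()

config = build_config(['my-host1', 'my-host2'], 'my-group')
print(config['bootstrap.servers'])  # my-host1,my-host2
```

Unlike kafka-python's poll(), confluent_kafka's poll() hands back one message per call, so there is no nested dict-of-lists to miss.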
Upvotes: 1