Reputation: 113
Is there any option to stop a consumer from consuming Kafka headers? In my case, I wrote a consumer to consume messages from a Kafka topic published by an upstream system. My processing doesn't require any information from the headers, and the published headers are heavyweight (bigger than the message itself), so my consumer is taking longer than expected.
Is there any option to consume only the message content and leave out the headers, so that I save the time spent transferring them over the network and deserializing them at the consumer? Your help is appreciated.
Upvotes: 1
Views: 1544
Reputation: 191844
Every message is a Record with Headers (as of Kafka 0.11).
Record:

    length: varint
    attributes: int8
        bit 0~7: unused
    timestampDelta: varlong
    offsetDelta: varint
    keyLength: varint
    key: byte[]
    valueLen: varint
    value: byte[]
    Headers => [Header]

Record Header:

    headerKeyLength: varint
    headerKey: String
    headerValueLength: varint
    Value: byte[]
Even if you never deserialize them, they will still be sent over the wire as part of the Record's TCP packet body.
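To illustrate, here is a minimal Java consumer sketch (the bootstrap server, group id, topic name, deserializers, and the process() helper are all placeholders, not anything from your setup). It only uses the value, yet the headers were still fetched as part of each record, and you can measure how much they weigh:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.header.Header;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ValueOnlyConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "value-only-group");        // placeholder
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("upstream-topic")); // placeholder topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // Only the value is processed, but the headers were still
                        // transferred over the network as part of this record.
                        process(record.value());

                        // Optional: see how many bytes the headers actually add.
                        long headerBytes = 0;
                        for (Header h : record.headers()) {
                            headerBytes += (h.value() == null ? 0 : h.value().length);
                        }
                        System.out.printf("value=%d bytes, headers=%d bytes%n",
                                record.serializedValueSize(), headerBytes);
                    }
                }
            }
        }

        private static void process(String value) {
            // application-specific processing (placeholder)
        }
    }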
You could try using a Kafka 0.10.2 client version, for example, which might drop the headers entirely because they simply weren't part of the API yet, but YMMV.
As mentioned in the comments, the most reliable fix is to stop sending such heavy information from the upstream application. A middle ground would be to compress and/or binary-encode that data.
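As a sketch of that middle ground, the upstream producer could at least enable batch compression so the headers cost less on the wire (broker address and topic are placeholders; compression applies to the whole record batch, headers included, and compression.type can also be set as a topic-level config):

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class CompressedProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Compress entire batches (values and headers alike) before they hit the network.
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("upstream-topic", "key", "value")); // placeholder topic
            }
        }
    }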
Upvotes: 1