This Fluentd service consumes from Kafka and stores data in OpenSearch.
The largest individual log records are about 32,700 bytes, while the typical volume of data is around 10 to 15 MB.
<buffer>
  chunk_limit_size 50m
  queue_limit_length 256
  flush_mode immediate
  flush_thread_count 7
  retry_max_times 3
  retry_wait 10s
  overflow_action throw_exception
</buffer>
2024-04-29 19:02:02 +0000 [warn]: #0 failed to write data into buffer by buffer overflow action=:throw_exception
2024-04-29 19:02:02 +0000 [warn]: #0 suppressed same stacktrace
2024-04-29 19:02:02 +0000 [warn]: #0 emit transaction failed: error_class=Fluent::Plugin::Buffer::BufferOverflowError error="buffer space has too many data" location="/usr/local/bundle/gems/fluentd-1.14.6/lib/fluent/plugin/buffer.rb:327:in `write'" tag="event-***"
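One variant I am considering to avoid the overflow, based on my reading of the buffer documentation (the buffer path and size values are placeholders, not a confirmed fix):

<buffer>
  @type file                    # file buffer allows a larger total size and survives restarts
  path /var/log/fluent/buffer   # hypothetical path
  chunk_limit_size 50m
  total_limit_size 2g           # explicit cap on total buffered data
  flush_mode interval
  flush_interval 5s
  flush_thread_count 7
  retry_max_times 3
  retry_wait 10s
  overflow_action block         # apply backpressure to the Kafka input instead of raising
</buffer>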
Logs with the "event-***" tag are duplicated in OpenSearch.
Please share any other solutions or suggestions.
Setting retry_forever true also results in duplicates in OpenSearch.
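From what I understand, the duplicates appear because retried chunks are indexed again without a stable document id. One deduplication approach I am considering is to derive the _id from a hash of each record; this sketch assumes the opensearch_genid filter shipped with fluent-plugin-opensearch, and the host and the tag pattern (the redacted tag from the log above) are placeholders:

<filter event-***>
  @type opensearch_genid
  hash_id_key _hash             # store a hash of the record in the _hash field
</filter>

<match event-***>
  @type opensearch
  host opensearch.example.com   # hypothetical endpoint
  port 9200
  id_key _hash                  # use the generated hash as the OpenSearch document _id
  write_operation create        # a re-sent document with the same _id is rejected, not indexed twice
</match>

With write_operation create, a chunk that is retried after a partial success should produce version conflicts for the records that were already indexed, rather than duplicate documents.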