Bob

Reputation: 1396

Kafka: Consumer Crashing

I inherited some Kafka code that I'm integrating into another project and came across an issue: after the consumer receives 3995 messages from the producer, it crashes with the following error:

ERROR Error while accepting connection (kafka.network.Acceptor) 
java.io.IOException: Too many open files

Information about the data being sent:
- Very bursty around the time of the crash
- Always crashes at 3995 messages

I am running it on a CentOS virtual machine, and I've run other, smaller data sets through it with ease. Thanks for your time!

Upvotes: 0

Views: 1664

Answers (1)

MrElephant

Reputation: 312

"Too many open files" means the process has hit the open-file-descriptor limit. You can run `lsof | wc -l` on your Linux machine to see how many files are currently open.
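Before raising anything, it helps to see where you stand. A minimal read-only sketch (safe to run as a normal user; `lsof` must be installed for the last line to report a meaningful count):

```shell
# Compare the current open-file count against the relevant limits.
echo "Per-process soft limit: $(ulimit -Sn)"
echo "Per-process hard limit: $(ulimit -Hn)"
echo "System-wide limit:      $(cat /proc/sys/fs/file-max)"
echo "Currently open (all):   $(lsof 2>/dev/null | wc -l)"
```

If the count is near the per-process soft limit rather than the system-wide one, it is the `ulimit`/`nofile` setting for the Kafka broker's user that needs raising.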

Follow this guide to increase the maximum number of open files:

The maximum number of open files was reached. How do I fix this problem? Many applications, such as the Oracle database or the Apache web server, need this limit to be considerably higher. You can increase the maximum number of open files by setting a new value in the kernel variable /proc/sys/fs/file-max as follows (logged in as root):

sysctl -w fs.file-max=100000
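Note that `fs.file-max` is the system-wide cap and `sysctl -w` does not survive a reboot; Kafka is also commonly limited by the per-process `nofile` limit of the user running the broker. A sketch of the config changes (the `kafka` user name and the value 100000 are assumptions; adjust for your setup):

```
# /etc/sysctl.conf -- persist the system-wide cap across reboots
fs.file-max = 100000

# /etc/security/limits.conf -- raise the per-process limit for the
# (hypothetical) user that runs the Kafka broker
kafka  soft  nofile  100000
kafka  hard  nofile  100000
```

After editing, `sysctl -p` reloads the sysctl settings, and the limits.conf change takes effect on the user's next login session.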

Upvotes: 2
