Hélène Martin

Reputation: 1441

Python too many open files, eventlet and multiprocessing.dummy

I have a Python 2.7 script running on Linux that crashes with IOError: [Errno 24] Too many open files. When I run lsof -p <script_pid> to see what files the script has open, I see an increasing number of anon_inode files.

This script first downloads files from S3 using eventlet for concurrency. It then processes the downloaded files using multiprocessing.dummy for multithreading. I have run the multithreaded code in isolation and found that it only leaks file descriptors when I include the following monkey patching for eventlet:

from eventlet import patcher

patcher.monkey_patch(thread=False)  # patch everything except the thread module
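For reference, here is a minimal sketch of the kind of two-stage pipeline described above (the URLs, pool sizes, and worker functions are hypothetical stand-ins, not my actual script):

from eventlet import patcher
patcher.monkey_patch(thread=False)  # patch everything except the thread module

import eventlet
import urllib2
from multiprocessing.dummy import Pool  # thread-backed Pool with the multiprocessing API

def download(url):
    # stand-in for the S3 download step
    return urllib2.urlopen(url).read()

def process(blob):
    # stand-in for the per-file processing step
    return len(blob)

urls = ['http://example.com/file%d' % i for i in range(100)]

# stage 1: concurrent downloads on an eventlet green pool
green_pool = eventlet.GreenPool(20)
blobs = list(green_pool.imap(download, urls))

# stage 2: processing on OS threads via multiprocessing.dummy
thread_pool = Pool(4)
results = thread_pool.map(process, blobs)
thread_pool.close()
thread_pool.join()

For what it's worth, anon_inode entries in lsof on Linux are often epoll instances, which is what eventlet's hub uses.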

Any ideas on how I could resolve this would be much appreciated!

Upvotes: 4

Views: 3068

Answers (1)

dawn360

Reputation: 11

I also ran into this issue. On Ubuntu, at least, the open files limit for normal users defaults to 4096, so if you are going to have more than ~4000 simultaneous connections you need to bump it up.
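To check the limits your process actually sees, one quick way (an illustrative snippet using Python's standard resource module) is:

import resource

# current soft/hard file-descriptor limits for this process
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print 'soft nofile limit: %d, hard nofile limit: %d' % (soft, hard)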

Solution: Raise open file descriptor limits

Here's how to do it on Ubuntu:

Add the following lines to the end of /etc/security/limits.conf:

* soft nofile 16384
* hard nofile 16384

The first column describes who the limit applies to. * is a wildcard, meaning all users. To raise the limits for root, you have to enter 'root' explicitly instead of '*'.

You also need to edit /etc/pam.d/common-session* and add the following line to the end:

session required pam_limits.so

Log out and back in before the new limit takes effect, then test with:

ulimit -n
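If you want to avoid logging out, a process can also raise its own soft limit up to the current hard limit at runtime; a minimal sketch, again using the standard resource module:

import resource

# raise this process's soft nofile limit up to its hard limit
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
print 'soft nofile limit raised from %d to %d' % (soft, hard)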

https://askubuntu.com/questions/162229/how-do-i-increase-the-open-files-limit-for-a-non-root-user

Also, here is a good article on the caveats of using eventlet: https://code.mixpanel.com/2010/10/29/gevent-the-good-the-bad-the-ugly/

Upvotes: 1
