Reputation: 3165
I'm running nginx as a frontend and php-fpm as a backend to process PHP files. I'm getting a "Too many open files" error in /var/log/php-fpm/error.log. I've increased the hard and soft ulimit to 65535, but that doesn't seem to solve the problem.
/var/log/php-fpm/error.log
[17-Sep-2012 14:43:51] ERROR: failed to prepare the stderr pipe: Too many open files (24)
[17-Sep-2012 14:43:52] ERROR: failed to prepare the stderr pipe: Too many open files (24)
ulimit -n
65535
/etc/php-fpm/www.conf
rlimit_files = 65535
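For reference, here is a quick way to confirm that the limit actually applies to the running php-fpm master process and not just to the current shell (a minimal sketch, assuming a standard Linux /proc layout and that the master is the oldest php-fpm process):

# find the php-fpm master (oldest php-fpm process) and inspect its effective limits
MASTER_PID=$(pgrep -o php-fpm)
grep "open files" /proc/$MASTER_PID/limits

# count how many descriptors that process is actually holding right now
ls /proc/$MASTER_PID/fd | wc -l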
Upvotes: 7
Views: 15416
Reputation: 1638
You could check with lsof (list open files); you will probably need to install it, as most Linux distros don't include it by default. It should show you which files are open and which processes own them (so you can grep for php). Run lsof | wc -l to count them. Are there really that many open? Or is the error more of a symptom of something deeper? If you know what the open files are, that may give a clue as to why...
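For example (a minimal sketch; lsof output varies between versions, and the raw count includes duplicate entries for threads):

# rough system-wide count of open files
lsof | wc -l

# narrow it down to processes whose command name starts with php-fpm
lsof -c php-fpm | wc -l

# see which file names dominate the list for php-fpm
lsof -c php-fpm | awk '{print $NF}' | sort | uniq -c | sort -rn | head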
Given that the message mentions the stderr pipe, it is probably error-related in the broad sense (an error-handling configuration parameter for php-fpm, perhaps?).
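One pool directive that is tied to the worker stderr pipes is catch_workers_output; whether it is involved here is an assumption on my part, but it is worth checking in www.conf:

; /etc/php-fpm/www.conf
; when enabled, php-fpm sets up a pipe per worker to capture its stdout/stderr
; into the main error log, which costs extra file descriptors in the master
catch_workers_output = yes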
Also, check whether you are using a Unix socket or a TCP one; try the TCP option unless you are on BSD, where Unix sockets are reputed to be faster.
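For example, switching the pool to TCP looks roughly like this (the address and port below are illustrative, not taken from your config):

; /etc/php-fpm/www.conf -- listen on TCP instead of a unix socket
listen = 127.0.0.1:9000

# matching directive inside the nginx location block that handles PHP
fastcgi_pass 127.0.0.1:9000;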
Upvotes: 4