user3378018

Reputation: 21

Execute a PHP file 100000 times at a time in CentOS

I need to run one PHP file 100000 times at a time. For that I used an exec command in a PHP file (runmyfile.php) and called that file using PuTTY. The runmyfile.php file has the following code:

// runmyfile.php -- launches myfile.php as 100000 background processes
for ($i = 0; $i < 100000; $i++) {
    exec('php -f /home/myserver/test/myfile.php > /dev/null &');
}

This executes myfile.php 100000 times in parallel.

myfile.php fetches rows from a MySQL database table, performs some calculations, and inserts the results into another table.
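A simplified sketch of what myfile.php does (the table and column names here are placeholders, not the real ones):

<?php
// myfile.php (simplified) -- read rows, calculate, write results to a second table.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

$result = $db->query('SELECT id, value FROM source_table');
while ($row = $result->fetch_assoc()) {
    $calculated = $row['value'] * 2; // placeholder for the real calculation

    $stmt = $db->prepare('INSERT INTO result_table (source_id, calculated) VALUES (?, ?)');
    $stmt->bind_param('id', $row['id'], $calculated);
    $stmt->execute();
    $stmt->close();
}
$db->close();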

But when I run it 100000 times, the server hangs. The server is running CentOS.

Sometimes I also get a "resource unavailable" error.

If I run it 1000 times, it works OK.

When I check ulimit -a, I see the following:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 514889
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1000000
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1024
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

And my MySQL max_connections is set to 200000.

Are there any settings I need to change so that I can execute my PHP file 100000 times properly?

Upvotes: 0

Views: 246

Answers (1)

Mikpa

Reputation: 1922

Maybe you need to redesign your application. If you need to process 2 billion records in a MySQL database on a daily basis, I would say that running 100000 scripts in parallel is not the best way.

That would mean each script processes 20000 records, if I understand you correctly. Would it not be possible to process more records in each script?
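For example, each script could be handed a contiguous range of ids, so that a handful of workers covers the whole table. A rough sketch (the worker.php name, the table names, and the numeric id column are just placeholders for illustration, so adjust to your schema):

<?php
// worker.php <start_id> <end_id> -- one script processes a whole range of rows.
$startId = (int) $argv[1];
$endId   = (int) $argv[2];

$db = new mysqli('localhost', 'user', 'password', 'mydb');

$stmt = $db->prepare('SELECT id, value FROM source_table WHERE id BETWEEN ? AND ?');
$stmt->bind_param('ii', $startId, $endId);
$stmt->execute();
$result = $stmt->get_result();

while ($row = $result->fetch_assoc()) {
    $calculated = $row['value'] * 2; // placeholder for the real calculation

    $insert = $db->prepare('INSERT INTO result_table (source_id, calculated) VALUES (?, ?)');
    $insert->bind_param('id', $row['id'], $calculated);
    $insert->execute();
    $insert->close();
}
$db->close();

The launcher then only needs to start a small, fixed number of processes instead of 100000:

// launcher.php -- start 10 workers, each responsible for a large chunk of rows.
$workers = 10;
$chunk   = 200000; // rows per worker, as an example
for ($i = 0; $i < $workers; $i++) {
    $start = $i * $chunk + 1;
    $end   = ($i + 1) * $chunk;
    exec("php -f /home/myserver/test/worker.php $start $end > /dev/null &");
}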

Have a look at Big Data.

Upvotes: 1
