DonOso

Reputation: 1

Multithreading issue on Linux

I have a server: CentOS 6.6 / 32 GB RAM / 500 Mbit

Task: I need to run a multi-threaded PHP script that fetches the content of different domains.

Problem: When I increase the number of threads from 20 to 100+, each thread takes much longer and sometimes I don't get a result from the remote domain at all. The bandwidth also varies a lot compared to 20 threads. As a result, increasing the number of threads gives no benefit and only lowers the overall quality of the results.

Debug:

- 10 threads: 1 thread takes 20 seconds
- 20 threads: 1 thread takes 21 seconds
- 50 threads: 1 thread takes 25 seconds
- 100 threads: 1 thread takes 45-100 seconds
- 150 threads: 1 thread takes 45-150 seconds

Alternatives: I tried the same script on:

- a different server with CentOS 7 and another PHP version
- Ruby
- pure bash: curl + GET commands

I got exactly the same result!

Question: What should I tune in the system settings to make multithreading work correctly?

Thanks in advance!

Upvotes: 0

Views: 244

Answers (2)

DonOso

Reputation: 1

Here is the simplest code

Thread:

#!/bin/bash
# m.sh - one "thread": fetch the page 50 times, one output file per request
for i in {1..50}; do
    curl -s --connect-timeout 3 -o /root/tmp/${1}_${i}.txt http://example.com/html
done

Launcher:

# start 50 copies of m.sh in the background (50 x 50 = 2500 requests total)
for i in {1..50}; do
    nohup ./m.sh ${i} > /dev/null &
done
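
For comparison, here is a sketch (not part of the original setup) of the same total workload with a fixed cap on concurrency, using GNU xargs -P so that a new request only starts when an earlier one finishes; the path and URL are the placeholders from the snippet above:

# Sketch: 2500 requests total, but never more than 20 curl processes at once.
# GNU xargs (standard on CentOS) provides -P for parallel execution.
seq 1 2500 | xargs -P 20 -I{} \
    curl -s --connect-timeout 3 -o /root/tmp/{}.txt http://example.com/html

Running it this way makes it easy to experiment with the cap and see at which concurrency the per-request time starts to climb.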

Upvotes: 0

Jonathan Allon

Reputation: 241

This is hard to answer without the code and the destination server you are polling, but I assume you are being throttled at one of the many points you pass through when accessing the remote servers. Either your ISP, the remote server, or your own hosting provider is limiting the number of connections you can open per second, slowing down every connection you make. This is very common in DDoS protection software.
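
One rough way to check for throttling (a sketch, assuming bash and the placeholder URL from the question's code; not part of the original answer) is to time the same request at increasing concurrency levels and see whether the average per-request time climbs:

# Sketch: print the average per-request time at several concurrency levels.
# A sharp jump past some level suggests a connection or rate limit along the path.
for p in 10 20 50 100; do
    : > /root/tmp/times.txt
    for i in $(seq 1 "$p"); do
        curl -s -o /dev/null -w '%{time_total}\n' --connect-timeout 3 \
            http://example.com/html >> /root/tmp/times.txt &
    done
    wait
    echo -n "concurrency=$p  avg_time="
    awk '{ sum += $1 } END { print sum / NR "s" }' /root/tmp/times.txt
done

If the averages degrade roughly like the timings in the question (fine up to about 20-50, then climbing), a per-source connection limit somewhere along the path is a likely explanation.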

Upvotes: 2
