Reputation: 455
I would like to know: if a large number of concurrent HTTP requests hit one PHP script on one server, will the requests become slow DUE TO there being a single PHP script?
Suppose I make more than one copy of the same script,
e.g.
script1.php, script2.php, script3.php
All three scripts have identical content. Will the page access speed be higher if each client request randomly picks one of these pages?
This is just my idea for reducing the load and the request-to-response time; I would like to know whether it makes sense.
Upvotes: 1
Views: 94
Reputation: 132494
If anything, having many requests for the same file should increase the speed of each request due to caching, even if it's only at the hardware level. Definitely don't make copies of the file.
Upvotes: 2
Reputation: 212522
No, it won't. It's the web server that processes the script, so any slowdown is in the web server, not in the number of copies of the PHP script file.
That's not to say that a badly written script won't add a lot of overhead.
But duplicating your script across several files is not a good idea: if you ever need to change the code, you have to make the same change in every copy.
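If you genuinely wanted several URLs for the same page, a safer pattern (a minimal sketch, with hypothetical file names) is to keep the logic in one shared file and require it from each entry point, so there is only ever one copy to maintain:

    <?php
    // shared.php -- the single copy of the real logic
    function handleRequest() {
        echo 'Hello from the one real script';
    }

    <?php
    // script1.php -- script2.php and script3.php would be identical one-liners
    require __DIR__ . '/shared.php';
    handleRequest();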
EDIT
You can actually test how well things will be processed when you have a large number of concurrent users by using tools such as ApacheBench, to see just how much of a problem this really is.
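For example, to hit the script with 1000 total requests, 50 at a time (the URL and numbers are just placeholders):

    ab -n 1000 -c 50 http://localhost/script1.php

The resulting report shows requests per second and time per request, which tells you whether the single script is actually a bottleneck under load.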
Upvotes: 1
Reputation: 157989
I would like to know: if a large number of concurrent HTTP requests hit one PHP script on one server, will the requests become slow DUE TO there being a single PHP script?
No.
This is just my idea for reducing the load and the request-to-response time; I would like to know whether it makes sense.
No.
Upvotes: 5