Reputation: 6925
I have coded a library that persists some data (indexed by process ID) between multiple requests in the same PHP-FPM process. Currently I have max_requests set to 10000, so the data is shared among these 10000 requests until the process dies.
When this data is unavailable (initially), the request currently being handled creates it, and subsequent requests use it.
Problem
I see that this data is created multiple times for the same process. So, are multiple requests handled by the same process concurrently? Or are the requests handled like a queue?
Upvotes: 0
Views: 801
Reputation: 48387
I have coded some library that persists some data (indexed by processID)
I'm really struggling to imagine what problem this is the right solution for. This sounds like an XY problem.
are multiple requests handled by same process concurrently
No - a PHP-FPM process can only handle one request at a time. In general, a process can only run one task at a time; the exception is lightweight processes (threads), where a single OS process contains more than one schedulable execution context. Event-driven programs (e.g. JavaScript, nginx) can create the illusion of handling more than one execution thread at a time, but in practice this is not the case - they merely switch quickly between different operations.
are the requests handled like a queue?
Yes - the listening socket is the head of the queue, and multiple worker processes are assigned items from it, one at a time.
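Since each worker handles requests strictly one after another, you can confirm this empirically with a per-process request counter. A minimal diagnostic sketch (hypothetical, not the asker's library; assumes a writable system temp directory):

```php
<?php
// Diagnostic sketch: count how many requests this FPM worker has handled
// by keeping a per-PID counter in a temp file. Under PHP-FPM the file
// outlives individual requests, so a fresh PID in the log means a new
// worker process, while a growing counter means the same worker is
// serving requests sequentially.
$pid  = getmypid();
$file = sys_get_temp_dir() . "/fpm_req_count_{$pid}";

// Read the previous count (0 if this worker has not handled a request yet).
$count = is_file($file) ? (int) file_get_contents($file) : 0;
$count++;

// Persist the new count for the next request served by this worker.
file_put_contents($file, (string) $count, LOCK_EX);

error_log("pid={$pid} handled request #{$count}");
```

If the log shows the same PID with a counter that only ever increases by one, the worker is serving requests as a queue; duplicated "creation" of the shared data therefore points at multiple worker processes (or worker recycling after pm.max_requests), not at concurrency within one process.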
Upvotes: 1