user3001408

Reputation: 310

Web Server Handling Concurrent API calls

I am trying to understand how a web server handles concurrent API calls.

Say I have an Apache-based web server, and at some instant 1000 people open my web page at the same time. The page makes one API call, and that call has to complete for each of these 1000 people before the page can render. The API takes about 10 seconds to complete.

Now say my web server is a quad-core system, so it can run at most 4 processes in parallel. Does that mean the server handles the first 4 users in the first 10 seconds, the next 4 users in the next 10 seconds, and so on?

But this does not make sense to me: big sites routinely serve thousands of concurrent users, and it is hard to believe that their servers have thousands of cores just to keep responses fast.

I have tried searching on Google, but after hours of looking I am still confused.

Can anyone kindly shed some light on this?

Upvotes: 0

Views: 210

Answers (1)

Tan Hong Tat

Reputation: 6864
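
Apache does not dedicate a whole process, let alone a whole core, to each request. With the worker MPM it serves each request on a thread, and a thread that is blocked waiting on your 10-second API call consumes almost no CPU. From the Apache documentation: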

This Multi-Processing Module (MPM) implements a hybrid multi-process multi-threaded server. By using threads to serve requests, it is able to serve a large number of requests with fewer system resources than a process-based server. However, it retains much of the stability of a process-based server by keeping multiple processes available, each with many threads.

Source: https://httpd.apache.org/docs/2.4/mod/worker.html

Source: https://httpd.apache.org/docs/2.4/mod/prefork.html
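
The practical consequence is that the limit on simultaneous requests comes from the MPM configuration, not from the core count. A thread (or prefork process) that is waiting on a slow backend call is blocked on I/O and is not scheduled onto a core, so a quad-core machine can hold hundreds of requests "in flight" as long as they are mostly waiting rather than computing. A minimal worker MPM sketch (the directive names are real; the numbers are purely illustrative, not a tuning recommendation):

    # Hypothetical worker MPM sizing -- numbers are illustrative only
    <IfModule mpm_worker_module>
        ServerLimit          16     # maximum number of child processes
        StartServers          4     # child processes created at startup
        ThreadsPerChild      25     # worker threads in each child process
        MinSpareThreads      25     # keep at least this many threads idle
        MaxSpareThreads      75     # reap children when idle threads exceed this
        MaxRequestWorkers   400     # concurrent requests = ServerLimit x ThreadsPerChild
    </IfModule>

With settings like these, a single quad-core server can have 400 requests waiting on the 10-second API at once, so your 1000 users would be served in roughly three 10-second waves rather than 250 of them. Raising the thread limits (or making the API call faster or asynchronous) pushes that further; the core count only becomes the bottleneck when requests actually burn CPU for the full 10 seconds.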

Upvotes: 1
