Lost

Reputation: 13565

IIS and single-threaded application

I am wondering about the request/response model of a single-threaded .NET application hosted in IIS. For example, consider a WebAPI application that is single-threaded: it reads a large file, processes it (does some string manipulation on the contents) and returns the processed contents in the API response. For the sake of argument, let's assume it takes 10 minutes to process the file.
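For concreteness, here is a minimal sketch of the kind of controller I mean (the path and the string manipulation are just placeholders):

    using System.IO;
    using System.Web.Http;

    public class FileProcessingController : ApiController
    {
        // GET api/fileprocessing
        [HttpGet]
        public IHttpActionResult Get()
        {
            // Placeholder path and "processing"; assume the whole thing takes ~10 minutes.
            string contents = File.ReadAllText(@"C:\data\largefile.txt");
            string processed = contents.ToUpperInvariant();
            return Ok(processed);
        }
    }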

Now, I read here that ASP.NET in IIS is multi-threaded:

How is ASP.NET multithreaded?

So what happens if I make 20 requests to the same API within a minute?

  1. What would happen to caller #2 and so on? Would they wait 10 minutes before IIS picks them up?
  2. Where does IIS track incoming requests?

Upvotes: 1

Views: 1303

Answers (3)

fenixil

Reputation: 2124

Here is a good article which describes all the layers an HTTP request passes through before it reaches your code :) Each layer has its own configuration and limits.

Please note that the .NET thread pool grows rather slowly: it adds about 2 threads per second. If you have synchronous blocking code in your controllers (for example, using lock or a mutex) and push 100 requests to the server, your code will handle the last request only after about 45 seconds: roughly 10 requests will be handled immediately (assuming the default thread pool size), then the thread pool will grow slowly and process the remaining 90 requests over about 45 seconds (adding 2 threads per second).
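If that slow ramp-up is a concern, one possible mitigation (a sketch of my own, not something prescribed by the article) is to raise the thread pool minimum at application startup, so threads below that count are created on demand instead of being injected at ~2 per second:

    using System.Threading;

    public static class ThreadPoolWarmup
    {
        // Call once at startup, e.g. from Application_Start in Global.asax.
        // Below the configured minimum, the pool creates threads on demand;
        // above it, threads are injected slowly (roughly 2 per second).
        public static void RaiseMinimum(int minWorkerThreads)
        {
            ThreadPool.GetMinThreads(out int workers, out int ioThreads);
            if (minWorkerThreads > workers)
            {
                ThreadPool.SetMinThreads(minWorkerThreads, ioThreads);
            }
        }
    }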

Upvotes: 0

Artyom Ignatovich

Reputation: 601

Without any tweaks, IIS will process requests concurrently, just as any other web server would.

  1. If the request queue size, request timeout and thread pool size are set to appropriate values, request #2 would be processed after spending about 10 minutes in the IIS request queue.

  2. No persistence. Only RAM.

It is a rather unusual scenario to wait for such long requests to complete and only then return a response. From a system design perspective it makes sense to queue this calculation up in a background job manager, or to cache the results when possible, returning data immediately rather than making the caller wait for many minutes.

Another option that comes to my mind looks as follows:

  1. Set up a background job for processing files one by one.
  2. After POSTing a file to the server, return a token representing its identity.
  3. Create a separate endpoint that consumes the file identity. The server should return either the processed result or a message indicating that the file is still in the queue waiting to be processed (see the sketch below).
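A rough sketch of that shape in Web API (the ConcurrentDictionary and Task.Run are only stand-ins for a real background job manager and job store; all names and routes are illustrative):

    using System;
    using System.Collections.Concurrent;
    using System.IO;
    using System.Threading.Tasks;
    using System.Web.Http;

    public class FileJobsController : ApiController
    {
        // Stand-in for a real job store; results live only in memory here
        // and would be lost on an application pool recycle.
        private static readonly ConcurrentDictionary<Guid, string> Results =
            new ConcurrentDictionary<Guid, string>();

        // Step 2: POST the file (here just a path, for brevity) and immediately
        // return a token representing its identity.
        [HttpPost]
        public IHttpActionResult Enqueue([FromBody] string filePath)
        {
            var token = Guid.NewGuid();
            Task.Run(() =>
            {
                // The long-running processing happens off the request thread.
                string processed = File.ReadAllText(filePath).ToUpperInvariant();
                Results[token] = processed;
            });
            return Ok(token);
        }

        // Step 3: poll with the token; get either the result or a "still processing" message.
        [HttpGet]
        public IHttpActionResult Status(Guid token)
        {
            if (Results.TryGetValue(token, out string processed))
            {
                return Ok(processed);
            }
            return Ok("The file is still in the queue or being processed.");
        }
    }

In a real application you would replace Task.Run with something recycle-safe (a dedicated background job manager or a separate worker service reading from a queue), since IIS can tear down in-process work at any time.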

Upvotes: 1

driis

Reputation: 164281

By default, IIS processes requests concurrently. If you don't actively do anything to prevent it, you would start processing the same file on two threads concurrently if two requests arrive at the same time.

You can use one of the synchronization primitives, such as lock(...) or a mutex, if concurrent processing is not desired. However, if it takes 10 minutes to process, are you sure you want to do it as part of a web request? There could be better alternatives.
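For illustration, a sketch of the lock approach (the path and the processing are placeholders); note that it serializes the requests, so request #2 simply blocks until request #1 finishes:

    using System.IO;
    using System.Web.Http;

    public class ProcessingController : ApiController
    {
        // Shared gate: only one request at a time can run the section below.
        private static readonly object Gate = new object();

        [HttpGet]
        public IHttpActionResult Get()
        {
            lock (Gate)
            {
                // Any other request blocks here until the current one completes.
                string processed = File.ReadAllText(@"C:\data\largefile.txt")
                                       .ToUpperInvariant();
                return Ok(processed);
            }
        }
    }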

Upvotes: 1
