Ravi Chandra

Reputation: 687

How to control multiple connections in ASP.NET web page

I have a web page index.aspx with corresponding server-side code in index.aspx.cs. The C# code has a method that must not be executed in parallel when multiple clients connect to my website. How can I restrict this?

Here is what the method does: it creates a folder, zips it, and makes it available for the user to download. My requirement is that while one user is executing this method, no other user should be able to run it, because it would create the same folder again, which leads to corruption of data.

I tried using Session objects, but I learned that session state is stored on a per-client basis.

Can anyone suggest a solution?

Upvotes: 1

Views: 1339

Answers (3)

Aristos

Reputation: 66641

If you use the classic ASP.NET session, you do not need to do anything, because the session already locks concurrent page execution.

If you don't, then you can follow what Marc suggests and use a Mutex.

About the session lock:
Web app blocked while processing another web app on sharing same session
jQuery Ajax calls to web service seem to be synchronous
ASP.NET Server does not process pages asynchronously
Replacing ASP.Net's session entirely

Upvotes: 0

Kai

Reputation: 2013

The Application context or a static class is application-wide, so you can store a flag there which indicates that the process has already started. After the process ends, you delete the flag.

http://msdn.microsoft.com/en-us/library/94xkskdf(v=vs.100).aspx

And always use Application.Lock when you write to application state, and lock(mutex) when you use a static class.
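A minimal sketch of the application-state approach, assuming this runs inside a page method (the flag name "ZipInProgress" is illustrative, not from the question):

```csharp
// Check-and-set the flag under Application.Lock so two requests
// cannot both see "not started" and proceed.
Application.Lock();
try
{
    if ((bool?)Application["ZipInProgress"] == true)
    {
        // Another request is already building the zip; bail out here.
        return;
    }
    Application["ZipInProgress"] = true;
}
finally
{
    Application.UnLock();
}

try
{
    // Create the folder and zip it here.
}
finally
{
    // Clear the flag when the process has ended, even on failure.
    Application.Lock();
    Application["ZipInProgress"] = false;
    Application.UnLock();
}
```

Note that Application.Lock only guards the reads/writes of the flag itself; the zip work runs outside the lock so other pages are not blocked for its whole duration.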

In your case a static class would be the better solution, because the application context seems to exist only for compatibility with classic ASP: Using static variables instead of Application state in ASP.NET

static readonly object mutex = new object();

lock (mutex)
{
    // Do the work; only one request at a time gets past this point.
}

Upvotes: 1

Marc Gravell

Reputation: 1062745

My immediate advice would be: create a random folder name per request, which would allow the requests to run concurrently. However, if that isn't an option, then you will need to synchronize using something like lock or a Mutex. Note that this only works well if you are returning the result from the current request, rather than zipping it in one request and letting the user download it in the next.
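The per-request folder idea can be sketched like this, assuming the files are staged under the system temp directory (the location is illustrative):

```csharp
using System;
using System.IO;

// A unique folder per request means users can never collide on the same path.
string workDir = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
Directory.CreateDirectory(workDir);
try
{
    // ... write the files to be zipped into workDir ...
}
finally
{
    // Clean up the per-request folder when done.
    Directory.Delete(workDir, recursive: true);
}
```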

Frankly, though, I think that you should do the zip in the request for the zip. Indeed, unless the file will be huge you don't even need to touch the file-system - you can create a zip in-memory using MemoryStream and any of the zip encoders (System.IO.Packaging.ZipPackage for example) - then just hand the client the data from the MemoryStream.

If you are using MVC, this is just return File(contents, contentType). With vanilla ASP.NET you need a few more steps.
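A sketch of the in-memory approach for vanilla ASP.NET (inside a Page or handler). The answer mentions System.IO.Packaging.ZipPackage; this sketch uses System.IO.Compression.ZipArchive (.NET 4.5+) as the zip encoder for brevity, and the entry and file names are illustrative:

```csharp
using System.IO;
using System.IO.Compression; // ZipArchive, .NET 4.5+

// Build the zip entirely in memory, then stream it to the client
// in the same request - no shared folder, so nothing to synchronize.
using (var ms = new MemoryStream())
{
    using (var zip = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
    {
        ZipArchiveEntry entry = zip.CreateEntry("report.txt"); // illustrative name
        using (var writer = new StreamWriter(entry.Open()))
        {
            writer.Write("file contents here");
        }
    }

    Response.ContentType = "application/zip";
    Response.AddHeader("Content-Disposition", "attachment; filename=download.zip");
    Response.BinaryWrite(ms.ToArray());
    Response.End();
}
```

The leaveOpen: true argument matters: disposing the ZipArchive finalizes the zip data but leaves the MemoryStream readable for the response.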

Upvotes: 1
