Andy

Reputation: 105

.net file access synchronisation

I'm trying to synchronise a Web API that's going to serve images. When a specific image is requested, it will check whether the image already exists and return it if it does; if it doesn't exist yet, it will create it and then return it.
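Roughly, the action looks like this at the moment (a simplified sketch using System.IO, System.Net, System.Net.Http and System.Net.Http.Headers; GetImagePath and GenerateImage are placeholders for my real code):

public HttpResponseMessage GetImage(long id)
{
    string path = GetImagePath(id);

    if (!File.Exists(path))       // two requests can both see "missing"...
        GenerateImage(id, path);  // ...and both try to create the same file

    var response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(File.OpenRead(path))
    };
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("image/png");
    return response;
}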

My problem is that it's obviously not thread safe just like that; one thread comes in, determines the image doesn't exist and starts to create it, while another request comes in, also determines the image doesn't exist (just yet) and tries to create it as well. I know I could lock the whole thing to avoid the issue, but I'm trying to avoid that. There will be 100,000s of images, and I don't see why I need to stop all threads from reading the other images just because one image doesn't exist yet. Is there a "usual" way of doing this? Images are requested by id; can I lock on the id of a particular image? For example:

List<object> _locks = new List<object>();
_locks.Add(17L);            // the image id, boxed so there is a reference to lock on
lock (_locks[0]) { ... }

It just doesn't look right... surely there's a better solution?

Upvotes: 1

Views: 97

Answers (2)

T.S.

Reputation: 19384

My friend, you want to avoid a "try-based" approach, you want to avoid a "locking" approach, and yet you are accessing a shared resource. Try calling corflags on any managed dll and leave the command window up. Then go to Windows Explorer and try to delete that file. You will get "the file is in use". Even Windows is not totally out of the woods with files, with all its caching mechanisms. Of course, the problem is that you have all files going through the same lock. Maybe you can do something that will minimize that condition. Let's see. This is a totally untested idea:

Create custom lock object

using System.IO;

public class CustomLock
{
    private readonly object _lock = new object();
    private readonly string _file;

    public CustomLock(string file)
    { _file = file; }

    public object Read()
    {
        lock (_lock)
        {
            // read your file
            return File.ReadAllBytes(_file);
        }
    }

    public object Create(byte[] content)
    {
        lock (_lock)
        {
            // create/write your file
            File.WriteAllBytes(_file, content);
            return content;
        }
    }

    public override bool Equals(object obj)
    {
        // two CustomLocks are equal if they guard the same file
        var other = obj as CustomLock;
        return other != null && other._file == _file;
    }

    public override int GetHashCode()
    {
        // here your hash algorithm, e.g. based on the file name
        return _file.GetHashCode();
    }
}

Now, use a Hashtable with Hashtable.Synchronized, as described here: http://msdn.microsoft.com/en-us/library/system.collections.hashtable.synchronized%28v=vs.110%29.aspx, to keep a cache of your locks.

Or you can synchronise a Dictionary; see "What's the best way of implementing a thread-safe Dictionary?". So, now it is going to be something like this:

 var custLock = GetCustLock(file); // This should always return a CustomLock object, new or existing
 object contents = custLock.Read();

Presumably, using this design, you will remove the bottleneck of locking all files in the same place. Please test it.
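For illustration, here is a minimal sketch of what GetCustLock could look like. It is only a sketch, and it uses ConcurrentDictionary.GetOrAdd instead of the synchronized Hashtable mentioned above, but the idea of a per-file cache of lock objects is the same:

using System.Collections.Concurrent;

// One lock object per file name; GetOrAdd makes sure every caller
// asking for the same file gets the same CustomLock instance.
private static readonly ConcurrentDictionary<string, CustomLock> _lockCache =
    new ConcurrentDictionary<string, CustomLock>();

private static CustomLock GetCustLock(string file)
{
    return _lockCache.GetOrAdd(file, f => new CustomLock(f));
}

If you stay with the synchronized Hashtable from the linked page instead, GetCustLock needs its own short lock around the check-and-insert, because the individual operations are thread safe but the check-then-add sequence is not.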

Upvotes: 0

Henk Holterman

Reputation: 273591

It depends on your server setup (single server or farm). On a single server you can use a shared HashSet. The filename is an excellent key. You only need a short lock around the set, and that won't hurt performance.

Here is a question about a concurrent HashSet
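For example, a rough sketch of that idea on a single server could look like this; GenerateImage is a placeholder for whatever creates the image, and the set tracks which files are currently being generated:

using System.Collections.Generic;
using System.IO;
using System.Threading;

private static readonly HashSet<string> _inProgress = new HashSet<string>();

public void EnsureImageExists(string fileName)
{
    if (File.Exists(fileName))
        return;

    bool creator;
    lock (_inProgress)                     // short lock: just check-and-add
    {
        creator = _inProgress.Add(fileName);
    }

    if (creator)
    {
        try
        {
            if (!File.Exists(fileName))    // re-check: it may have appeared meanwhile
                GenerateImage(fileName);   // expensive work done outside the lock
        }
        finally
        {
            lock (_inProgress) { _inProgress.Remove(fileName); }
        }
    }
    else
    {
        // another request is already generating this image; wait until it is done
        while (true)
        {
            lock (_inProgress)
            {
                if (!_inProgress.Contains(fileName))
                    return;
            }
            Thread.Sleep(50);
        }
    }
}

The waiting loop is the crudest part of the sketch; the point is only that the lock on the set is held very briefly, so the 100,000s of images that already exist are never blocked on it.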

Upvotes: 2
