ren

Reputation: 3993

avoiding multiple locks while protecting multiple resources

Suppose I need to read and write 1000 files from multiple threads (i.e. one file could be written from multiple threads). I don't want to have one lock to protect all files, since that would be slow. All I can think of is to have a List<object> holding 1000 locks and do something like lock(list[i]) { write to the i'th file }. Is that the right way to do this?

This would be the approach:

    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    static class Program
    {
        static readonly object _list_lock = new object();
        static readonly List<object> locks = new List<object>();
        // Random is not thread-safe, and instances created in the same tick
        // share a seed, so use a single instance guarded by a lock.
        static readonly Random _random = new Random();

        public static void Main(string[] args)
        {
            // One lock object per file.
            for (int i = 0; i < 1000; i++)
                locks.Add(new object());

            var tasks = new List<Task>();
            for (int i = 0; i < 15000; i++)
            {
                var t = Task.Run(() =>
                {
                    int file;
                    lock (_random)
                    {
                        file = _random.Next(0, 1000);
                    }

                    object l;
                    lock (_list_lock)
                    {
                        // guard the lookup in the shared list
                        l = locks[file];
                    }
                    lock (l)
                    {
                        // write to file 'file'
                    }
                });
                tasks.Add(t);
            }
            tasks.ForEach(f => f.Wait());
        }
    }

Upvotes: 0

Views: 100

Answers (1)

WhoIsRich

Reputation: 4163

If you keep a List<> of the file paths currently being written, and check it before each read or write, I would think it only needs to be as big as the number of threads running.

You just need to make sure that the class that adds and removes entries from the List is thread-safe, so that two threads cannot claim the same path at the same time.
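
A minimal sketch of that idea, assuming a HashSet<string> of in-use paths guarded by a single lock (the names ActiveWrites, TryClaim, Release and WriteSafely are illustrative, not from the answer):

    using System;
    using System.Collections.Generic;
    using System.Threading;

    // Hypothetical helper: tracks which file paths are currently being written.
    // The set never holds more entries than there are active threads.
    static class ActiveWrites
    {
        static readonly HashSet<string> _inUse = new HashSet<string>();
        static readonly object _gate = new object();

        // Atomically claim a path; returns false if another thread already owns it.
        public static bool TryClaim(string path)
        {
            lock (_gate) { return _inUse.Add(path); }
        }

        public static void Release(string path)
        {
            lock (_gate) { _inUse.Remove(path); }
        }
    }

    static class FileWriter
    {
        // Claim the path before writing and release it afterwards.
        public static void WriteSafely(string path, Action writeAction)
        {
            while (!ActiveWrites.TryClaim(path))
                Thread.Sleep(1);   // another thread is writing this file; wait briefly

            try { writeAction(); }
            finally { ActiveWrites.Release(path); }
        }
    }

Since the set only ever holds the paths currently being written, it stays no larger than the number of running threads, and the lock on it is held only long enough to add or remove one entry.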

Upvotes: 2
