Matt Burland

Reputation: 45145

Correct way to implement a resource pool

I'm trying to implement something that manages a pool of resources, such that the calling code can request an object and will be given one from the pool if it's available, or else will be made to wait. I'm having trouble getting the synchronization to work correctly, however. What I have in my pool class is something like this (where autoEvent is an AutoResetEvent initially set as signaled):

public Foo GetFooFromPool()
{
    autoEvent.WaitOne();
    var foo = Pool.FirstOrDefault(p => !p.InUse);
    if (foo != null)
    {
        foo.InUse = true;
        autoEvent.Set();
        return foo;
    }
    else if (Pool.Count < Capacity)
    {
        System.Diagnostics.Debug.WriteLine("count {0}\t capacity {1}", Pool.Count, Capacity);
        foo = new Foo() { InUse = true };
        Pool.Add(foo);
        autoEvent.Set();
        return foo;
    }
    else
    {
        return GetFooFromPool();
    }
}

public void ReleaseFoo(Foo p)
{
    p.InUse = false;
    autoEvent.Set();
}

The idea is that when you call GetFooFromPool, you wait until signaled, then try to find an existing Foo that is not in use. If you find one, we set it to InUse and then fire a signal so other threads can proceed. If we don't find one, we check whether the pool is full. If not, we create a new Foo, add it to the pool, and signal again. If neither of those conditions is satisfied, we are made to wait again by calling GetFooFromPool recursively.

Now in ReleaseFoo we just set InUse back to false, and signal the next thread waiting in GetFooFromPool (if any) to try and get a Foo.

The problem seems to be in how I'm managing the size of the pool. With a capacity of 5, I'm ending up with 6 Foos. I can see my debug line print count 0 a couple of times, and count 1 may appear a couple of times as well. So clearly I have multiple threads getting into that block when, as far as I can see, they shouldn't be able to.

What am I doing wrong here?

Edit: A double-checked lock like this:

else if (Pool.Count < Capacity)
{
    lock(locker)
    {
        if (Pool.Count < Capacity)
        {
            System.Diagnostics.Debug.WriteLine("count {0}\t capacity {1}", Pool.Count, Capacity);
            foo = new Foo() { InUse = true };
            Pool.Add(foo);
            autoEvent.Set();
            return foo;
        }
    }
} 

does seem to fix the problem, but I'm not sure it's the most elegant way to do it.

Upvotes: 2

Views: 2387

Answers (2)

Alex

Reputation: 13224

As was already mentioned in the comments, a counting semaphore is your friend. Combine this with a concurrent stack and you have a nice, simple, thread-safe implementation that can still lazily allocate your pool items.

The bare-bones implementation below provides an example of this approach. Note that another advantage here is that you do not need to "contaminate" your pool items with an InUse flag just to track their state.

Note that, as a micro-optimization, a stack is preferred over a queue here, because it hands back the most recently returned instance, which may still be warm in, e.g., the L1 cache.

public class GenericConcurrentPool<T> : IDisposable where T : class
{
    private readonly SemaphoreSlim _sem;
    private readonly ConcurrentStack<T> _itemsStack;
    private readonly Action<T> _onDisposeItem;
    private readonly Func<T> _factory;

    public GenericConcurrentPool(int capacity, Func<T> factory, Action<T> onDisposeItem = null)
    {
        // Seed the stack with 'capacity' null placeholders; real items are
        // created lazily by the factory the first time they are popped.
        _itemsStack = new ConcurrentStack<T>(new T[capacity]);
        _factory = factory;
        _onDisposeItem = onDisposeItem;
        _sem = new SemaphoreSlim(capacity);
    }

    public async Task<T> CheckOutAsync()
    {
        await _sem.WaitAsync();
        return Pop();
    }

    public T CheckOut()
    {
        _sem.Wait();
        return Pop();
    }

    public void CheckIn(T item)
    {
        Push(item);
        _sem.Release();
    }

    public void Dispose()
    {
        _sem.Dispose();
        if (_onDisposeItem != null)
        {
            T item;
            while (_itemsStack.TryPop(out item))
            {
                if (item != null)
                    _onDisposeItem(item);
            }
        }
    }

    private T Pop()
    {
        // The semaphore guarantees an entry is available, so TryPop cannot fail here.
        T item;
        var result = _itemsStack.TryPop(out item);
        Debug.Assert(result);
        // A null entry is an unused placeholder: allocate the real item now.
        return item ?? _factory();
    }

    private void Push(T item)
    {
        Debug.Assert(item != null);
        _itemsStack.Push(item);
    }
}
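
For example, a hypothetical usage sketch (the Foo type, its members, and the capacity are assumptions standing in for whatever expensive resource you actually pool):

```csharp
// "Foo" is a placeholder for the pooled resource type.
var pool = new GenericConcurrentPool<Foo>(
    capacity: 5,
    factory: () => new Foo(),           // invoked lazily, at most 5 times
    onDisposeItem: f => f.Dispose());   // optional cleanup when the pool is disposed

var foo = await pool.CheckOutAsync();   // waits if all 5 items are checked out
try
{
    foo.DoWork();                       // hypothetical work method
}
finally
{
    pool.CheckIn(foo);                  // always return the item to the pool
}
```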

Upvotes: 8

Chris Hannon

Reputation: 4134

There are a few problems with what you're doing, but your specific race condition is likely caused by a situation like the following. Imagine you have a capacity of one.

1) There is one unused item in the pool.

2) Thread #1 grabs it and signals the event.

3) Thread #2 finds no available item and enters the capacity block, but does not add its new item yet.

4) Thread #1 returns the item to the pool and signals the event.

5) Repeat steps 1, 2, and 3 using two other threads (e.g. #3, #4).

6) Thread #2 adds an item to the pool.

7) Thread #4 adds an item to the pool.

There are now two items in a pool with a capacity of one.

Your implementation has other potential issues, however.

  • Depending on how your Pool.Count and Add() are synchronized, you might not see an up-to-date value.
  • You could potentially have multiple threads grab the same unused item.
  • Controlling access with an AutoResetEvent opens you up to difficult-to-find issues (like this one), because you're attempting a lockless solution instead of simply taking a lock and using Monitor.Wait() and Monitor.Pulse() for this purpose.
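
As a sketch of that last suggestion (untested, and assuming the same Pool, Capacity, and locker fields as the question), a lock-based version might look like:

```csharp
public Foo GetFooFromPool()
{
    lock (locker)
    {
        while (true)
        {
            // Safe to scan and mutate here: we hold the lock.
            var foo = Pool.FirstOrDefault(p => !p.InUse);
            if (foo != null)
            {
                foo.InUse = true;
                return foo;
            }
            if (Pool.Count < Capacity)
            {
                foo = new Foo() { InUse = true };
                Pool.Add(foo);
                return foo;
            }
            // Pool is full and everything is in use: release the lock
            // and block until ReleaseFoo pulses us awake.
            Monitor.Wait(locker);
        }
    }
}

public void ReleaseFoo(Foo p)
{
    lock (locker)
    {
        p.InUse = false;
        Monitor.Pulse(locker);  // wake one thread waiting in GetFooFromPool
    }
}
```

The while loop matters: a woken thread must re-check the pool state, because another thread may have grabbed the released item first.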

Upvotes: 2
