Reputation: 481
I tried my hand at implementing a minimal thread-safe blocking queue; this is what I came up with:
class BlockingQueue<T>
{
    private Queue<T> myQueue = new Queue<T>();
    private SemaphoreSlim semaPhore = new SemaphoreSlim(0);

    public void Enqueue(T t)
    {
        lock (myQueue)
        {
            myQueue.Enqueue(t);
            semaPhore.Release();
        }
    }

    public T Dequeue()
    {
        semaPhore.Wait();
        lock (myQueue)
        {
            return myQueue.Dequeue();
        }
    }
}
I stress-tested it with several producers and consumers enqueueing/dequeueing at random time intervals, and it did not fail.
However, looking critically at the code: could something happen between the semaPhore.Wait() and the lock(myQueue)?
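For context, the harness I used was roughly like the following (the class and constant names in the harness are made up for illustration; only the BlockingQueue itself is from the code above):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// The queue from the question, reproduced so the sketch compiles on its own.
class BlockingQueue<T>
{
    private Queue<T> myQueue = new Queue<T>();
    private SemaphoreSlim semaPhore = new SemaphoreSlim(0);

    public void Enqueue(T t)
    {
        lock (myQueue)
        {
            myQueue.Enqueue(t);
            semaPhore.Release();
        }
    }

    public T Dequeue()
    {
        semaPhore.Wait();
        lock (myQueue)
        {
            return myQueue.Dequeue();
        }
    }
}

class StressTest
{
    static void Main()
    {
        var queue = new BlockingQueue<int>();
        const int producers = 4, consumers = 4, itemsPerProducer = 250;
        long consumed = 0;

        // Consumers: each takes its share of the total item count.
        var consumerTasks = Enumerable.Range(0, consumers)
            .Select(_ => Task.Run(() =>
            {
                for (int i = 0; i < producers * itemsPerProducer / consumers; i++)
                {
                    queue.Dequeue();
                    Interlocked.Increment(ref consumed);
                }
            })).ToArray();

        // Producers: enqueue with small random pauses.
        var producerTasks = Enumerable.Range(0, producers)
            .Select(seed => Task.Run(() =>
            {
                var rnd = new Random(seed);
                for (int i = 0; i < itemsPerProducer; i++)
                {
                    queue.Enqueue(i);
                    Thread.Sleep(rnd.Next(2));
                }
            })).ToArray();

        Task.WaitAll(producerTasks);
        Task.WaitAll(consumerTasks);
        Console.WriteLine(consumed); // all items accounted for: 1000
    }
}
```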
Upvotes: 2
Views: 363
Reputation: 481
Using the suggestions, I came up with this; surprisingly, it uses even fewer lines of code :)
class BlockingQueue<T>
{
    private Queue<T> myQueue = new Queue<T>();

    public void Enqueue(T t)
    {
        lock (myQueue)
        {
            myQueue.Enqueue(t);
            Monitor.Pulse(myQueue); // wake one waiting consumer, if any
        }
    }

    public T Dequeue()
    {
        lock (myQueue)
        {
            // Wait in a loop: a pulse sent before any consumer was
            // waiting would otherwise be lost, and another consumer
            // may have taken the item between the pulse and our wakeup.
            while (myQueue.Count == 0)
                Monitor.Wait(myQueue);
            return myQueue.Dequeue();
        }
    }
}
Upvotes: -1
Reputation: 111860
Yes, it could happen... A Thread.Abort() would break this queue:

semaPhore.Wait();
// Here a Thread.Abort() happens
lock(myQueue) ...

What happens then is that there is one element in the queue that no one will ever be able to recover, because there is one less "free" semaphore slot than there are queue items.

Other than this (and the fact that .NET 4.0 already ships a BlockingCollection<> written by experts), I would say the code is correct. semaPhore.Release() causes an implicit memory barrier (internally it takes a lock), so there are no memory-ordering problems: the Enqueue() really completes before the Release() is done. Note that it will under-perform, because each operation takes two locks (yours plus the one inside SemaphoreSlim).
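For completeness, a minimal sketch of the same producer/consumer exchange using the built-in BlockingCollection<T> (the class and variable names here are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Demo
{
    static void Main()
    {
        // BlockingCollection<T> wraps a ConcurrentQueue<T> by default.
        var queue = new BlockingCollection<int>();

        var consumer = Task.Run(() =>
        {
            int sum = 0;
            // GetConsumingEnumerable() blocks until items arrive and
            // finishes once CompleteAdding() has been called and the
            // collection is drained.
            foreach (var item in queue.GetConsumingEnumerable())
                sum += item;
            return sum;
        });

        for (int i = 1; i <= 5; i++)
            queue.Add(i);
        queue.CompleteAdding();

        Console.WriteLine(consumer.Result); // 1+2+3+4+5 = 15
    }
}
```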
Upvotes: 2