Reputation: 24403
So I have many threads that are feeding me input data, which must be processed by a single thread in the order of arrival. Currently, all the input items end up in a queue, and reads and writes to the queue are protected with the C# lock statement. However, over time the CPU usage of the application rises to an unacceptable level, and the profiler says that the majority of the CPU time is being spent on the lock statement itself. Is there a more efficient synchronization method than the lock that supports many writers and one reader?
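Roughly, the current setup looks like this (a simplified sketch; InputItem and the processing code are placeholders for the real thing):

private readonly Queue<InputItem> _queue = new Queue<InputItem>();
private readonly object _queueLock = new object();

// Called from many producer threads.
public void Enqueue(InputItem item)
{
    lock (_queueLock)
    {
        _queue.Enqueue(item);
    }
}

// The single consumer thread polls this in a loop.
public bool TryDequeue(out InputItem item)
{
    lock (_queueLock)
    {
        if (_queue.Count > 0)
        {
            item = _queue.Dequeue();
            return true;
        }
        item = default(InputItem);
        return false;
    }
}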
Upvotes: 4
Views: 1788
Reputation: 16162
If you are using .NET 4.0, you can use ConcurrentQueue<T>, which is part of the concurrent collections, instead of the normal Queue<T>, and get rid of the lock when reading from and writing to the queue. The concurrent collections are designed to handle concurrent reads and writes with lock-free code.
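A minimal sketch of what that could look like (ConcurrentQueue<T> lives in System.Collections.Concurrent; InputItem and ProcessItem are placeholders):

private readonly ConcurrentQueue<InputItem> _queue = new ConcurrentQueue<InputItem>();

// Any number of producer threads can enqueue without taking a lock.
public void Enqueue(InputItem item)
{
    _queue.Enqueue(item);
}

// The single consumer thread drains the queue in FIFO order.
public void DrainQueue()
{
    InputItem item;
    while (_queue.TryDequeue(out item))
    {
        ProcessItem(item); // placeholder for the actual processing
    }
}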
If you are not using 4.0, another option is to take the lock only when no other thread already holds it. You can achieve that by using Monitor.TryEnter instead of lock; note that lock itself is a Monitor.Enter and Monitor.Exit combination. A sample implementation would be:
private readonly object _syncObject = new object();

private bool TryUpdate(object someData)
{
    // Try to take the lock without blocking; bail out if another thread holds it.
    if (Monitor.TryEnter(_syncObject))
    {
        try
        {
            // Update the data here.
            return true;
        }
        finally
        {
            Monitor.Exit(_syncObject);
        }
    }
    return false;
}
Upvotes: 2
Reputation: 1192
It might be a big change to your app, but you could consider making your queue external to your application (for example MSMQ). Your writer threads could then write to that queue to their hearts' content, and your reader could just pick the items off when it's ready. If the bulk of your CPU time is spent just on the lock around your queue (I assume you are not actually locking around the work on the items being put on the queue), then putting the queue external to your app could really help. Ideally you could also split the writing and reading into separate processes.
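A rough sketch of that with MSMQ via System.Messaging (the queue path and the InputItem payload type are just examples):

using System.Messaging;

const string QueuePath = @".\Private$\InputItems"; // example private queue name

// Writer threads (or a separate writer process) just send messages.
var writerQueue = MessageQueue.Exists(QueuePath)
    ? new MessageQueue(QueuePath)
    : MessageQueue.Create(QueuePath);
writerQueue.Send(someData);

// The single reader receives them in order of arrival.
var readerQueue = new MessageQueue(QueuePath)
{
    Formatter = new XmlMessageFormatter(new[] { typeof(InputItem) }) // needed to deserialize Body
};
while (true)
{
    var message = readerQueue.Receive(); // blocks until a message is available
    var item = (InputItem)message.Body;
    // process item here
}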
Another thing to check is that the object you are locking on is not being used as a lock somewhere else in your app. A monitor (the mechanism behind the lock statement) is probably the lightest-weight thread synchronization method there is, so it might be best to re-architect things to avoid taking locks in the same process that is doing the processing of items.
Upvotes: 1
Reputation: 51329
It sounds like the writers are contending with each other for the locks. Consider a model where each writer has its own queue, and where the reader uses the Peek method to read the first message off of each queue without removing it. The reader can then keep iterating between the queues, peeking the first item among the set of first items from each queue, and then removing and processing that first item. It will be slower than your current architecture, but should eliminate the lock contention among the writers.
A trivial example might look like:
public class TimestampedItem<T> : IComparable<TimestampedItem<T>>
{
    public DateTime TimeStamp { get; set; }
    public T Data { get; set; }

    public int CompareTo(TimestampedItem<T> other)
    {
        return TimeStamp.CompareTo(other.TimeStamp);
    }
}

public void ReadFirstFromEachQueue<T>(IEnumerable<Queue<TimestampedItem<T>>> queues)
{
    while (true)
    {
        // Peek the head of each non-empty queue under its own lock,
        // remembering which queue it came from.
        var heads = queues
            .Select(q => { lock (q) { return new { Queue = q, Item = q.Count > 0 ? q.Peek() : null }; } })
            .Where(x => x.Item != null)
            .ToList();

        if (heads.Count == 0)
            continue; // nothing ready yet; in practice you would wait rather than spin

        // The earliest item across all queues is removed from its queue and processed.
        var next = heads.OrderBy(x => x.Item.TimeStamp).First();
        lock (next.Queue)
        {
            next.Queue.Dequeue();
        }
        ProcessItem(next.Item);
    }
}
Upvotes: 4