Reputation: 8237
Let's say I have an event which gets fired about 10 times per second:
void Session_OnEvent(object sender, CustomEventArgs e)
{
    // DoStuff
    DoLongOperation(e);
}
I want the method DoLongOperation(e) to be processed on a separate thread every time the event gets fired.
I could do something like:
new Thread(DoLongOperation).Start(e);
but I have a feeling this is not good for performance. I want to achieve the best performance, so what is the best thing I can do?
Thanks in advance.
Edit: when I said "long" I didn't mean an operation that would take more than 1 second maximum; it's just that I don't want the event to wait that long, so I want to run it on a separate thread.
Upvotes: 4
Views: 1085
Reputation: 52675
If you're using C# 4.0 you may want to consider using the task scheduler. Since your DoLongOperation implies it will be long-running, you should consider the following:
Long-Running Tasks
You may want to explicitly prevent a task from being put on a local queue. For example, you may know that a particular work item will run for a relatively long time and is likely to block all other work items on the local queue. In this case, you can specify the LongRunning option, which provides a hint to the scheduler that an additional thread might be required for the task so that it does not block the forward progress of other threads or work items on the local queue. By using this option you avoid the ThreadPool completely, including the global and local queues.
The other nice thing about using the TaskScheduler is that it has MaximumConcurrencyLevel. This allows you to adjust your concurrency relatively easily after doing the testing that Jon has recommended.
Here is a sample from MSDN that does just that
namespace System.Threading.Tasks.Schedulers
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading;

    class Program
    {
        static void Main()
        {
            LimitedConcurrencyLevelTaskScheduler lcts = new LimitedConcurrencyLevelTaskScheduler(1);
            TaskFactory factory = new TaskFactory(lcts);

            factory.StartNew(() =>
            {
                for (int i = 0; i < 500; i++)
                {
                    Console.Write("{0} on thread {1}", i, Thread.CurrentThread.ManagedThreadId);
                }
            });

            Console.ReadKey();
        }
    }

    /// <summary>
    /// Provides a task scheduler that ensures a maximum concurrency level while
    /// running on top of the ThreadPool.
    /// </summary>
    public class LimitedConcurrencyLevelTaskScheduler : TaskScheduler
    {
        /// <summary>Whether the current thread is processing work items.</summary>
        [ThreadStatic]
        private static bool _currentThreadIsProcessingItems;

        /// <summary>The list of tasks to be executed.</summary>
        private readonly LinkedList<Task> _tasks = new LinkedList<Task>(); // protected by lock(_tasks)

        /// <summary>The maximum concurrency level allowed by this scheduler.</summary>
        private readonly int _maxDegreeOfParallelism;

        /// <summary>Whether the scheduler is currently processing work items.</summary>
        private int _delegatesQueuedOrRunning = 0; // protected by lock(_tasks)

        /// <summary>
        /// Initializes an instance of the LimitedConcurrencyLevelTaskScheduler class with the
        /// specified degree of parallelism.
        /// </summary>
        /// <param name="maxDegreeOfParallelism">The maximum degree of parallelism provided by this scheduler.</param>
        public LimitedConcurrencyLevelTaskScheduler(int maxDegreeOfParallelism)
        {
            if (maxDegreeOfParallelism < 1) throw new ArgumentOutOfRangeException("maxDegreeOfParallelism");
            _maxDegreeOfParallelism = maxDegreeOfParallelism;
        }

        /// <summary>Queues a task to the scheduler.</summary>
        /// <param name="task">The task to be queued.</param>
        protected sealed override void QueueTask(Task task)
        {
            // Add the task to the list of tasks to be processed. If there aren't enough
            // delegates currently queued or running to process tasks, schedule another.
            lock (_tasks)
            {
                _tasks.AddLast(task);
                if (_delegatesQueuedOrRunning < _maxDegreeOfParallelism)
                {
                    ++_delegatesQueuedOrRunning;
                    NotifyThreadPoolOfPendingWork();
                }
            }
        }

        /// <summary>
        /// Informs the ThreadPool that there's work to be executed for this scheduler.
        /// </summary>
        private void NotifyThreadPoolOfPendingWork()
        {
            ThreadPool.UnsafeQueueUserWorkItem(_ =>
            {
                // Note that the current thread is now processing work items.
                // This is necessary to enable inlining of tasks into this thread.
                _currentThreadIsProcessingItems = true;
                try
                {
                    // Process all available items in the queue.
                    while (true)
                    {
                        Task item;
                        lock (_tasks)
                        {
                            // When there are no more items to be processed,
                            // note that we're done processing, and get out.
                            if (_tasks.Count == 0)
                            {
                                --_delegatesQueuedOrRunning;
                                break;
                            }

                            // Get the next item from the queue
                            item = _tasks.First.Value;
                            _tasks.RemoveFirst();
                        }

                        // Execute the task we pulled out of the queue
                        base.TryExecuteTask(item);
                    }
                }
                // We're done processing items on the current thread
                finally { _currentThreadIsProcessingItems = false; }
            }, null);
        }

        /// <summary>Attempts to execute the specified task on the current thread.</summary>
        /// <param name="task">The task to be executed.</param>
        /// <param name="taskWasPreviouslyQueued"></param>
        /// <returns>Whether the task could be executed on the current thread.</returns>
        protected sealed override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued)
        {
            // If this thread isn't already processing a task, we don't support inlining
            if (!_currentThreadIsProcessingItems) return false;

            // If the task was previously queued, remove it from the queue
            if (taskWasPreviouslyQueued) TryDequeue(task);

            // Try to run the task.
            return base.TryExecuteTask(task);
        }

        /// <summary>Attempts to remove a previously scheduled task from the scheduler.</summary>
        /// <param name="task">The task to be removed.</param>
        /// <returns>Whether the task could be found and removed.</returns>
        protected sealed override bool TryDequeue(Task task)
        {
            lock (_tasks) return _tasks.Remove(task);
        }

        /// <summary>Gets the maximum concurrency level supported by this scheduler.</summary>
        public sealed override int MaximumConcurrencyLevel { get { return _maxDegreeOfParallelism; } }

        /// <summary>Gets an enumerable of the tasks currently scheduled on this scheduler.</summary>
        /// <returns>An enumerable of the tasks currently scheduled.</returns>
        protected sealed override IEnumerable<Task> GetScheduledTasks()
        {
            bool lockTaken = false;
            try
            {
                Monitor.TryEnter(_tasks, ref lockTaken);
                if (lockTaken) return _tasks.ToArray();
                else throw new NotSupportedException();
            }
            finally
            {
                if (lockTaken) Monitor.Exit(_tasks);
            }
        }
    }
}
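Applied to the question's event, the factory above can be used directly from the handler. Here is a minimal self-contained sketch; CustomEventArgs and DoLongOperation are stand-ins for the asker's types, and the CountdownEvent exists only so the demo can wait for completion (to cap concurrency, pass a LimitedConcurrencyLevelTaskScheduler from the sample above to the TaskFactory constructor):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class CustomEventArgs : EventArgs { public int Value; }

class SessionDemo
{
    // One shared factory for the whole session. Constructing it with a
    // LimitedConcurrencyLevelTaskScheduler (as in the sample above) would
    // cap how many operations run at once.
    static readonly TaskFactory factory = new TaskFactory();
    static readonly CountdownEvent done = new CountdownEvent(10);

    static void Session_OnEvent(object sender, CustomEventArgs e)
    {
        // The handler returns immediately; DoLongOperation runs as a task.
        factory.StartNew(() => DoLongOperation(e));
    }

    static void DoLongOperation(CustomEventArgs e)
    {
        done.Signal(); // stand-in for the real work
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            Session_OnEvent(null, new CustomEventArgs { Value = i });
        done.Wait();
        Console.WriteLine("all 10 tasks completed");
    }
}
```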
Upvotes: 2
Reputation: 19986
Use one thread to process your request, and enqueue work items for the thread from your event.
Concretely:
As a member objects of the class, do this:
List<CustomEventArgs> _argsqueue;
Thread _processor;
volatile bool _shouldEnd;
In the constructor of the class, do:
_argsqueue = new List<CustomEventArgs>();
_processor = new Thread(ProcessorMethod);
_processor.Start();
Define ProcessorMethod:
void ProcessorMethod()
{
    while (!_shouldEnd)
    {
        CustomEventArgs e = null;
        lock (_argsqueue)
        {
            if (_argsqueue.Count > 0)
            {
                e = _argsqueue[0];
                _argsqueue.RemoveAt(0);
            }
        }
        if (e != null)
        {
            DoLongOperation(e);
        }
        else
        {
            Thread.Sleep(100);
        }
    }
}
And in your event:
lock (_argsqueue)
{
    _argsqueue.Add(e.Clone());
}
You'll have to work out the details for yourself; for example, when the form closes or the object in question is disposed, you'll have to:
_shouldEnd=true;
_processor.Join();
Upvotes: 2
Reputation: 34820
The performance will depend heavily on several factors:
Ten times per second is a fairly high rate of activity. Depending on the duration of the execution, it might make more sense to use a separate process, such as a service.
The activity must obviously be thread-safe, meaning (in part) there is no resource contention. If two threads might need to update the same resource (a file, a memory location), you'll need to use locking. This can impede efficiency if not handled well.
Upvotes: 1
Reputation: 437854
The direct answer to your question is: use the managed thread pool by utilizing ThreadPool.QueueUserWorkItem to push your operations to it. (You may want to take a look at the answer to the question "when do I use the thread pool vs. my own threads?").
However, look at the bigger picture: if all the operations you are starting take more than 100 msec to finish, then you are mathematically going to generate more work than you can handle. This is not going to end well no matter how you slice it. For example, if you create a separate thread each time then your process will run out of threads, if you use the thread pool then you will swamp it with work that it will never be able to finish, etc.
If only some of your operations end up being long, and most complete immediately, then you may have a chance for a practical solution. Otherwise, you need to rethink your program design.
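As a minimal self-contained sketch of the thread-pool approach: CustomEventArgs and DoLongOperation here are stand-ins for the asker's types, and the CountdownEvent exists only so the demo can wait for the queued work to finish before exiting.

```csharp
using System;
using System.Threading;

class CustomEventArgs : EventArgs { public int Value; }

class SessionDemo
{
    static readonly CountdownEvent done = new CountdownEvent(10);

    static void Session_OnEvent(object sender, CustomEventArgs e)
    {
        // Hand the work off to a pool thread; the event handler returns immediately.
        ThreadPool.QueueUserWorkItem(state => DoLongOperation((CustomEventArgs)state), e);
    }

    static void DoLongOperation(CustomEventArgs e)
    {
        done.Signal(); // stand-in for the real work
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            Session_OnEvent(null, new CustomEventArgs { Value = i });
        done.Wait();
        Console.WriteLine("all 10 operations completed");
    }
}
```

Note that this only avoids blocking the event; it does nothing about the overload problem described above, since the pool's queue will still grow if work arrives faster than it completes.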
Upvotes: 6
Reputation: 16048
Yes, you can do so. But when your event fires 10 times per second and you start 10 long-running operations per second, you will run out of threads very quickly.
Upvotes: 0