Reputation: 1
To process data in a large array that is being filled quickly, I created a few threads to process it in parallel. Deleting the first element and logging the result have to be synchronized, so I used AutoResetEvent. The issue is that at some point, when deleting the element, it throws an error saying index 0 is out of range, even though the array has elements in it. I have no idea how this is happening.
private void InitializeProcessingThreads()
{
    for (int i = 0; i < numberOfThreads; i++)
    {
        ares[i] = new AutoResetEvent(false);
        int index = i;
        threads[i] = new Thread(() =>
        {
            try
            {
                while (true)
                {
                    if (BigArray.Count < index + 1)
                    {
                        Debug.WriteLine("BigArray is smaller than " + (index + 1));
                        continue;
                    }
                    byte[] data = BigArray[index];
                    Processing(data);
                    // do other processing here
                    if (index > 0)
                    {
                        ares[index - 1].WaitOne();
                    }
                    else
                    {
                        ares[numberOfThreads - 1].WaitOne();
                    }
                    Debug.WriteLine(Thread.CurrentThread.Name + " Completed processing");
                    if (BigArray.Count != 0)
                        BigArray.RemoveAt(0); // Or call another thread to remove all the packets and do some general work
                    ares[index].Set();
                    if (index > 0)
                    {
                        ares[index - 1].Reset();
                    }
                    else
                    {
                        ares[numberOfThreads - 1].Reset();
                    }
                }
            }
            catch (Exception e)
            {
                Debug.WriteLine(Thread.CurrentThread.Name + ", " + BigArray.Count + ", " + e);
                throw;
            }
        });
        threads[i].Name = "Thread" + i;
    }
    ares[numberOfThreads - 1].Set();
}
The threads run in this circular pattern forever. I tried to remove the element only when the array is not empty, but it still eventually throws the error.
Upvotes: -1
Views: 85
Reputation: 4698
It looks like you need a different approach for this task. Instead of deleting elements, use the array as a circular buffer with pointers to the first element and to the last element.
This not only avoids the problem you describe but also improves performance: removing the first element of a list shifts every remaining element, while advancing a head pointer is O(1).
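A minimal sketch of that idea, assuming a fixed capacity and a single lock for synchronization; the CircularBuffer name and its members are illustrative, not something from your code:

// A fixed-size ring of slots guarded by one lock.
public sealed class CircularBuffer
{
    private readonly byte[][] slots;
    private readonly object gate = new object();
    private int head;   // index of the first (oldest) element
    private int tail;   // index one past the last (newest) element
    private int count;

    public CircularBuffer(int capacity)
    {
        slots = new byte[capacity][];
    }

    // Producer side: the filling thread appends here; fails when full
    // instead of growing, so memory use stays bounded.
    public bool TryEnqueue(byte[] item)
    {
        lock (gate)
        {
            if (count == slots.Length) return false; // buffer full
            slots[tail] = item;
            tail = (tail + 1) % slots.Length;
            count++;
            return true;
        }
    }

    // Consumer side: "deleting the first element" becomes advancing
    // the head pointer, which cannot go out of range while count > 0.
    public bool TryDequeue(out byte[] item)
    {
        lock (gate)
        {
            if (count == 0) { item = null; return false; } // buffer empty
            item = slots[head];
            slots[head] = null; // let the GC reclaim the packet
            head = (head + 1) % slots.Length;
            count--;
            return true;
        }
    }
}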
Use a single object to guard the buffer for multithreaded access. Create a dispatcher method that queues requests and hands out the appropriate number of elements per request.
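If a built-in type is acceptable, .NET's BlockingCollection<T> already plays the role of that single shared object: consumers block while the buffer is empty, so they can never read past its end. A sketch reusing your numberOfThreads and Processing names (the Dispatcher class, its members, and the capacity are mine):

using System.Collections.Concurrent;
using System.Threading;

public class Dispatcher
{
    // One shared, bounded buffer; 10000 is an arbitrary example capacity.
    private readonly BlockingCollection<byte[]> buffer =
        new BlockingCollection<byte[]>(boundedCapacity: 10000);

    public void Start(int numberOfThreads)
    {
        for (int i = 0; i < numberOfThreads; i++)
        {
            var t = new Thread(() =>
            {
                // GetConsumingEnumerable blocks while the buffer is empty,
                // so no consumer ever sees an out-of-range index.
                foreach (byte[] data in buffer.GetConsumingEnumerable())
                    Processing(data);
            });
            t.Name = "Thread" + i;
            t.IsBackground = true;
            t.Start();
        }
    }

    // The filling thread calls this for each incoming packet.
    public void Add(byte[] packet) => buffer.Add(packet);

    private void Processing(byte[] data) { /* per-packet work */ }
}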
Upvotes: 0