Reputation: 7525
I have the following code running under .Net 4.5
Parallel.For(0, fileArray.Length, i =>
{
    DataRow dataRow = table.NewRow();
    var dr = GetDataRow(fileArray[i], dataRow, parameters);
    if (dr["MyVariable"].ToString() != "0")
    {
        try
        {
            table.Rows.Add(dr);
        }
        catch (Exception exception)
        {
            ConfigLogger.Instance.LogError(exception);
        }
    }
});
Seemingly at random, this loop will max out the processor on the machine and stall, making no further progress. It is processing 11k files, and I cannot get the problem to repeat with a smaller set of files. Does anyone have any ideas on how to debug this and figure out what is causing it? I cannot replicate it on my machine; the differences between my machine and production are as follows:
Production Win 7 64 bit, .Net 4.5
Development Win 8 64 bit, .Net 4.5.1
Is there a way to place a timeout on each iteration of the Parallel.For loop?
Upvotes: 3
Views: 1041
Reputation: 127593
As mentioned in the comments, you need to use thread-local data tables; Parallel has built-in support for that. Also, there is no reason to use Parallel.For here — Parallel.ForEach would be much more appropriate for this situation.
Parallel.ForEach(fileArray,
    () =>
    {
        // Create a temp table per thread that has the same schema as the main
        // table. DataTable instance members (including Clone) are not thread
        // safe, hence the lock.
        lock (table)
        {
            return table.Clone();
        }
    },
    (file, loopState, localTable) =>
    {
        DataRow dataRow = localTable.NewRow();
        var dr = GetDataRow(file, dataRow, parameters);
        if (dr["MyVariable"].ToString() != "0")
        {
            try
            {
                localTable.Rows.Add(dr);
            }
            catch (Exception exception)
            {
                ConfigLogger.Instance.LogError(exception);
            }
        }
        return localTable;
    },
    (localTable) =>
    {
        // Merge the thread-local table into the master table.
        lock (table)
        {
            table.Merge(localTable);
        }
    });
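On the timeout question: there is no built-in per-iteration timeout, but the loop as a whole can be cancelled after a deadline by passing a CancellationToken through ParallelOptions (the token is checked between iterations, so a single stuck iteration will not be interrupted mid-flight, but the loop will stop scheduling new work and throw instead of hanging silently). A minimal sketch, assuming the same table, fileArray, and parameters as above and a hypothetical 30-minute deadline:

```csharp
// Cancel the whole loop if it runs longer than the deadline.
using (var cts = new CancellationTokenSource(TimeSpan.FromMinutes(30)))
{
    var options = new ParallelOptions { CancellationToken = cts.Token };
    try
    {
        Parallel.ForEach(fileArray, options,
            () => { lock (table) { return table.Clone(); } },
            (file, loopState, localTable) =>
            {
                // ... same loop body as above ...
                return localTable;
            },
            localTable => { lock (table) { table.Merge(localTable); } });
    }
    catch (OperationCanceledException exception)
    {
        // The deadline elapsed; log it rather than letting the process hang.
        ConfigLogger.Instance.LogError(exception);
    }
}
```

This won't tell you which file is stalling, but combined with per-file logging it narrows the search considerably.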
Upvotes: 3