Reputation: 41
When I use a plain foreach loop, or Parallel.ForEach with a lock, the data comes out correctly, but when I use Parallel.ForEach without the lock, the data is inconsistent and some of it gets lost. Also, if instead of the shared ApiFileItems model I pass each of the model's values as a parameter inside the Parallel.ForEach body and remove the shared model, the output is consistent with no data loss (see the sketch after the code below).
My problem now is that with the lock the performance is very slow, while Parallel.ForEach without the lock is fast but the data is inconsistent and gets lost. I have been stuck on this for almost 4 days and cannot find a way to improve the performance.
private static void ParallelExecution(JArray ContentNode,
    ApiFileItems apiFileItems, ApiItemType apiItemType)
{
    Parallel.ForEach(ContentNode.Values(), new ParallelOptions() { }, (rootNode) =>
    {
        lock (ContentNode)
        {
            if (rootNode.HasValues)
            {
                ParallelRootItemExecution(rootNode, apiFileItems, apiItemType);
            }
            else
            {
                // Log the message
            }
        }
    });
}
private static void ParallelRootItemExecution(JToken rootNode,
    ApiFileItems apiFileItems, ApiItemType apiItemType)
{
    Parallel.ForEach<JToken>(rootNode.Values(),
        new ParallelOptions() { MaxDegreeOfParallelism = 4 }, (metaNode) =>
    {
        lock (rootNode)
        {
            bool foundValue = false;
            apiFileItems.relativeFilePath = metaNode["valueString"].ToString();
            if (!foundFolderItems.TryGetValue(apiFileItems.relativeFilePath,
                out foundValue))
            {
                foundFolderItems.TryAdd(apiFileItems.relativeFilePath, true);
                ParallelExecution(String.Format(apiFileItems.relativeGroupUrl,
                    apiFileItems.hostName, apiFileItems.publicationId,
                    apiFileItems.relativeFilePath), apiFileItems, apiItemType);
            }
        }
    });
}
Without the lock, data loss happens and the data is not consistent.
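For reference, removing the shared model looks roughly like this (a sketch, not my exact code; it assumes foundFolderItems is a ConcurrentDictionary<string, bool>):

Parallel.ForEach(rootNode.Values(), new ParallelOptions() { MaxDegreeOfParallelism = 4 }, (metaNode) =>
{
    // Each iteration works on its own local copy of the path instead of
    // writing into the shared apiFileItems model, so no lock is needed here.
    string relativeFilePath = metaNode["valueString"].ToString();

    // ConcurrentDictionary.TryAdd is thread-safe: only the first caller wins
    if (foundFolderItems.TryAdd(relativeFilePath, true))
    {
        string url = String.Format(apiFileItems.relativeGroupUrl,
            apiFileItems.hostName, apiFileItems.publicationId, relativeFilePath);
        // ...continue processing with the local url and relativeFilePath...
    }
});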
Upvotes: 1
Views: 531
Reputation: 7610
Parallel.ForEach allows the iterations of a loop to be executed in parallel, but because your whole loop body is inside a lock, only one iteration can run at a time. That is effectively synchronous code with degraded performance, because it also has to manage the lock.
A simple solution would be to do everything synchronously with a classic foreach.
If the synchronous performance is not satisfactory, then you need to identify where the code can't be concurrent and lock only that part.
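For example, a minimal sketch of that idea (items and ExpensiveWork are placeholders, not your real code): do the heavy per-item work outside the lock and serialize only the update of the shared result.

var results = new List<string>();
object gate = new object();

Parallel.ForEach(items, item =>
{
    // Runs concurrently: no shared state is touched here
    string processed = ExpensiveWork(item);

    // Only this short, non-concurrent update is serialized
    lock (gate)
    {
        results.Add(processed);
    }
});

The less work you put inside the lock, the more of the loop actually runs in parallel.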
Upvotes: 2