Reputation: 6348
The code below uses two DataTables, both filled by a SqlDataAdapter from SQL Server 2012. dtParticipants has 15,000 rows and dtSubjectResult has 80,000 rows.
foreach (DataRow rowPart in dtParticipants.Rows)
{
    int partId = int.Parse(rowPart["main_id"].ToString());
    string partUserCode = rowPart["o_user_code"].ToString();

    DataRow[] filteredRows = dtSubjectResult.Select("main_id = " + partId);
    if (filteredRows.Length == 0) // Select never returns null; check for an empty result
        continue;

    foreach (DataRow row in filteredRows)
    {
        row["correct_count"] = 5;    //unit.CorrectCount;
        row["incorrect_count"] = 15; //unit.IncorrectCount;
        row["empty_count"] = 20;     //unit.EmptyCount;
        row["net_count"] = 12;       //unit.Total;
    }//foreach 2
}//foreach 1
But this code is very slow. When I comment out the code inside the second foreach, it runs without any speed problem. Interestingly, when I write it like this, it works fine:
foreach (DataRow row in dtSubjectResult.Rows)
{
    row["correct_count"] = 5;    //unit.CorrectCount;
    row["incorrect_count"] = 15; //unit.IncorrectCount;
    row["empty_count"] = 20;     //unit.EmptyCount;
    row["net_count"] = 12;       //unit.Total;
}
So what is the problem? Is looping through the filtered rows what makes it slow?
Upvotes: 1
Views: 643
Reputation: 35720
The repeated filtering is a serious performance hit:

DataRow[] filteredRows = dtSubjectResult.Select("main_id = " + partId);

Select has to parse the filter expression and scan all 80,000 rows of dtSubjectResult on each of the 15,000 iterations. Instead, Join the two tables and update the dtSubjectResult rows in one go:
var grouped = dtParticipants.AsEnumerable()
    .Join(dtSubjectResult.AsEnumerable(),
        rPart => (int)rPart["main_id"],
        rSubj => (int)rSubj["main_id"],
        (rPart, rSubj) => new { P = rPart, S = rSubj })
    .GroupBy(x => x.P["main_id"]);

foreach (var gr in grouped)
{
    foreach (var row in gr)
    {
        row.S["correct_count"] = 5;    //unit.CorrectCount;
        row.S["incorrect_count"] = 15; //unit.IncorrectCount;
        row.S["empty_count"] = 20;     //unit.EmptyCount;
        row.S["net_count"] = 12;       //unit.Total;
    }
}
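If you prefer to keep the per-participant loop structure, another option is to build an index over dtSubjectResult once with ToLookup and then look rows up by key. Here is a minimal self-contained sketch with tiny stand-in tables (the real ones come from SqlDataAdapter); only the main_id and correct_count columns are shown:

```csharp
using System;
using System.Data;
using System.Linq;

class Program
{
    static void Main()
    {
        // Stand-in tables for the ones filled by SqlDataAdapter.
        var dtParticipants = new DataTable();
        dtParticipants.Columns.Add("main_id", typeof(int));
        dtParticipants.Rows.Add(1);
        dtParticipants.Rows.Add(2);

        var dtSubjectResult = new DataTable();
        dtSubjectResult.Columns.Add("main_id", typeof(int));
        dtSubjectResult.Columns.Add("correct_count", typeof(int));
        dtSubjectResult.Rows.Add(1, 0);
        dtSubjectResult.Rows.Add(1, 0);
        dtSubjectResult.Rows.Add(2, 0);

        // Build the index once: one pass over dtSubjectResult,
        // instead of a full 80k-row scan per participant.
        ILookup<int, DataRow> byMainId = dtSubjectResult.AsEnumerable()
            .ToLookup(r => r.Field<int>("main_id"));

        foreach (DataRow rowPart in dtParticipants.Rows)
        {
            int partId = rowPart.Field<int>("main_id");
            // Hashed lookup; no filter-expression parsing or table scan.
            foreach (DataRow row in byMainId[partId])
            {
                row["correct_count"] = 5; //unit.CorrectCount;
            }
        }

        Console.WriteLine(dtSubjectResult.AsEnumerable()
            .All(r => r.Field<int>("correct_count") == 5));
    }
}
```

This makes the work O(n + m) rather than O(n × m); a missing key simply yields an empty sequence, so no null/empty check is needed.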
Upvotes: 1