Reputation: 76
I have a problem reading data from a huge SQL table which contains 420 columns and 13500 rows.
I want to convert the table to 3 columns. With this code I solved the conversion part:
foreach (var item in mytable)
{
    var result = db.mytable.Where(t => t.TaskID == item.TaskID).Select(e => e).Single();
    for (int i = 1; i <= 200; i++)
    {
        try
        {
            if (result != null)
            {
                valmat = result.GetType().GetProperties().Where(a => a.Name == "_" + i.ToString()).Select(p => p.GetValue(result, null)).FirstOrDefault().ToString();
            }
            if (valmat != null)
            {
                WeekTB week = new WeekTB();
                week.ColumnID = i;
                week.Taskid = Convert.ToInt32(item.TaskID);
                week.ValueWeek = Convert.ToDouble(valmat);
                //db.WeekTBs.InsertOnSubmit(week);
                Console.WriteLine("taskID:" + item.TaskID + " Column:" + i + " Value:" + valmat);
                valmat = null;
            }
        }
        catch (Exception ex)
        {
            // empty catch swallows the NullReferenceException thrown by
            // .ToString() when FirstOrDefault() returns null for a NULL column
        }
    }
}
I don't have any problem reading the data itself; my problem is that reading is slow, because the table contains a lot of NULL values and getting each value takes too long.
Upvotes: 0
Views: 63
Reputation: 76
I solved my problem by converting the Excel file to a CSV file. It is much faster: with 14000 rows and 500 columns, reading the Excel file with C# took 3 hours, but after changing the Excel file to CSV, C# read all the data in just 3 seconds.
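For reference, a minimal sketch of reading a CSV file line by line in C# (the file name `data.csv` is a placeholder; this simple `Split` does not handle quoted fields containing commas):

```csharp
using System;
using System.IO;

class CsvSketch
{
    static void Main()
    {
        // File.ReadLines streams the file row by row instead of
        // loading the whole file into memory, which keeps large
        // exports (e.g. 14000 rows x 500 columns) fast and cheap.
        foreach (var line in File.ReadLines("data.csv"))
        {
            var cells = line.Split(',');   // naive split: no quoted-field handling
            // process cells[0] .. cells[cells.Length - 1] here
            Console.WriteLine(cells.Length);
        }
    }
}
```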
Upvotes: 0
Reputation: 10790
My ultimate advice is to change your database design. But if that is not possible, you can start by working on this line:
valmat = result.GetType().GetProperties().Where(a => a.Name == "_" + i.ToString()).Select(p => p.GetValue(result, null)).FirstOrDefault().ToString();
Your type is the same for every iteration, yet you enumerate the properties and filter them again on each pass. I would suggest something like this:
var columnNames = Enumerable.Range(1, 200).Select(t => "_" + t).ToArray();
var properties = typeof(MyTable).GetProperties()
    .Where(t => columnNames.Contains(t.Name))
    .ToDictionary(t => t.Name); // outside of the loop

valmat = properties["_" + i].GetValue(result, null)?.ToString(); // inside of the loop
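Putting the cached property dictionary into the original loop might look like the sketch below (`MyTable`, `WeekTB`, `mytable`, and `db` are the asker's own types and are assumed here, not defined by this answer):

```csharp
// Build the reflection lookup once, before any looping.
var columnNames = Enumerable.Range(1, 200).Select(t => "_" + t).ToArray();
var properties = typeof(MyTable).GetProperties()
    .Where(p => columnNames.Contains(p.Name))
    .ToDictionary(p => p.Name);

foreach (var item in mytable)
{
    var result = db.mytable.Single(t => t.TaskID == item.TaskID);
    for (int i = 1; i <= 200; i++)
    {
        // Dictionary lookup instead of re-enumerating all 420 properties.
        var value = properties["_" + i].GetValue(result, null);
        if (value == null) continue;   // skip the many NULL cells cheaply

        var week = new WeekTB
        {
            ColumnID = i,
            Taskid = Convert.ToInt32(item.TaskID),
            ValueWeek = Convert.ToDouble(value)
        };
        //db.WeekTBs.InsertOnSubmit(week);
    }
}
```

The `continue` on NULL values replaces the original try/catch-and-swallow pattern, so NULL-heavy rows no longer pay the cost of throwing and catching exceptions.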
Upvotes: 1