Xami Yen

Reputation: 267

DataTable Takes Forever to Load Data from SqlDataReader

I am loading data from an MS SQL Server Table using the following code:

using (SqlDataReader rdr = cmd.ExecuteReader())
{
    if (rdr.HasRows)
    {
        dt.Load(rdr); //takes forever to load
    }
    if (dt.Rows.Count > 0 && !dt.HasErrors)
    {
        Parallel.For(0, dt.Rows.Count, i =>
        {
            byte[] docBytes = (byte[])dt.Rows[i]["DocObject"];
            File.WriteAllBytes(
                Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Documents", dt.Rows[i]["FileName"].ToString().ToLower()),
                docBytes);
        });
    }
}

The SQL query executes in less than one second. The data contains a SQL image column that holds binary document data. I timed the execution with a Stopwatch from System.Diagnostics and found that this single dt.Load(rdr) statement takes approximately 5 minutes to load about 5,000 records. My application needs to load several million rows, and at this rate it would be unusable. This is a standard Windows Forms application. Any ideas why dt.Load(rdr) takes forever? Any suggestions for rewriting this code or improving its performance would be greatly appreciated.

Upvotes: 4

Views: 922

Answers (2)

KumarHarsh

Reputation: 5094

The code below is untested; it is just an idea.

Another approach would be to define an entity class and populate a list directly from the SqlDataReader, without using a DataTable at all.

Also, close the database connection as soon as possible, so do not do other work while fetching.

I hope you are using connection pooling in your connection string.
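For reference, connection pooling is enabled by default in ADO.NET; a minimal sketch of a connection string with the pooling settings made explicit (server, database, and pool size are placeholder values):

```csharp
// Hypothetical connection string: "Pooling=true" is already the default,
// and "Max Pool Size" is shown only for illustration.
var connStr = "Server=myServer;Database=myDb;Integrated Security=true;" +
              "Pooling=true;Max Pool Size=100;";
```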

public class Example
{
    public byte[] DocObject { get; set; }
    public string FileName { get; set; }
}

List<Example> objList = new List<Example>();

using (SqlDataReader rdr = cmd.ExecuteReader())
{
    while (rdr.Read())
    {
        Example obj = new Example();

        obj.DocObject = (byte[])rdr["DocObject"]; // suitable cast
        obj.FileName = rdr["FileName"].ToString(); // suitable cast

        objList.Add(obj);
    }
}

if (objList.Count > 0)
{
    Parallel.For(0, objList.Count, i =>
    {
        byte[] docBytes = objList[i].DocObject;
        File.WriteAllBytes(
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Documents", objList[i].FileName.ToLower()),
            docBytes);
    });
}

Upvotes: 0

David Browne - Microsoft

Reputation: 88881

Try something like this, instead of loading all the rows into memory on the client:

using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{ 
    while (rdr.Read())
    {
        string fn = rdr.GetString(0);
        using (var rs = rdr.GetStream(1))
        {
            var fileName = $"c:\\temp\\{fn}.txt";
            using (var fs = File.OpenWrite(fileName))
            {
                rs.CopyTo(fs);
            }
            Console.WriteLine(fileName);

        }
    }                
}
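Note that CommandBehavior.SequentialAccess requires columns to be read in ordinal order, so the query must select FileName before DocObject for the reader above to work. A sketch of the assumed command shape (the table name and connection variable are placeholders):

```csharp
// Hypothetical query: FileName must be ordinal 0 and DocObject ordinal 1,
// because SequentialAccess forbids reading columns out of order.
using (var cmd = new SqlCommand(
    "SELECT FileName, DocObject FROM dbo.Documents", conn))
using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    // ... stream each row's DocObject to disk as shown above ...
}
```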

Upvotes: 6
