Karthik

Reputation: 2399

efficient way to store large datatable into database

I have created a DataTable dynamically from a text file, and now I need to store its values in a database. I found some examples that loop over the rows with foreach and DataRow, and that works fine:

  foreach (DataRow row in dt.Rows)
  {
      // one INSERT statement (and one server round trip) per row
  }

My concern is that the DataTable may contain thousands of rows. Is it efficient to use a foreach loop, or should I go for some other technique such as bulk copy? (I have no idea about it, so please point me in the right direction.) Thank you.

Upvotes: 1

Views: 1423

Answers (2)

Mr Moose

Reputation: 6344

I'd have to second SqlBulkCopy. I am using it to load millions of rows into an EAV database via the IDataReader interface, and it finishes in less than a minute. Here is an example of how I use it:

    private void SaveAll(List<MyBO> bos, IDbConnection conn, IDbTransaction trans)
    {
        // Wrap the list in an IDataReader so SqlBulkCopy can stream it.
        using (GenericListDataReader<MyBO> reader = new GenericListDataReader<MyBO>(bos))
        {
            using (SqlBulkCopy bcp = new SqlBulkCopy(
                           (SqlConnection)conn,
                           SqlBulkCopyOptions.CheckConstraints | SqlBulkCopyOptions.FireTriggers,
                           (SqlTransaction)trans))
            {
                bcp.BulkCopyTimeout = Constants.BULK_COPY_TIMEOUT;
                bcp.DestinationTableName = "MainAttr";

                // Map each source property to its destination column by name.
                bcp.ColumnMappings.Add("Id", "Id");
                bcp.ColumnMappings.Add("Mainid", "MainId");
                bcp.ColumnMappings.Add("Attributecolumn", "AttributeColumn");
                bcp.ColumnMappings.Add("Attributevalue", "AttributeValue");
                bcp.ColumnMappings.Add("Loaddate", "LoadDate");
                bcp.ColumnMappings.Add("Loadby", "LoadBy");
                bcp.ColumnMappings.Add("detailid", "DetailId");

                // Raise a progress event every BULK_COPY_PROGRESS_REPORT rows.
                bcp.NotifyAfter = Constants.BULK_COPY_PROGRESS_REPORT;
                bcp.SqlRowsCopied += new SqlRowsCopiedEventHandler(bcp_SqlRowsCopied);

                bcp.WriteToServer(reader);
            }
        }
    }

By the way, if you are interested in learning more about your options, there is quite a good data loading guide on MSDN: http://msdn.microsoft.com/en-us/library/dd425070(v=sql.100).aspx

Edit: for a further example of the GenericListDataReader, check out the following site: http://www.csvreader.com/posts/generic_list_datareader.php
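In case that link ever goes stale: the core idea is a class that implements IDataReader over an IEnumerable<T>, exposing each public property of T as a column. Below is a minimal sketch of that idea (the class name SimpleListDataReader and its reflection-based mapping are my own illustration, not the linked implementation); it implements only the members SqlBulkCopy calls for a plain IDataReader source and stubs the rest:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Reflection;

    // Sketch of a list-backed IDataReader: each public property of T becomes a column.
    public sealed class SimpleListDataReader<T> : IDataReader
    {
        private readonly IEnumerator<T> _rows;
        private readonly PropertyInfo[] _props = typeof(T).GetProperties();

        public SimpleListDataReader(IEnumerable<T> rows) { _rows = rows.GetEnumerator(); }

        // Members SqlBulkCopy uses for an IDataReader source.
        public int FieldCount { get { return _props.Length; } }
        public bool Read() { return _rows.MoveNext(); }
        public object GetValue(int i) { return _props[i].GetValue(_rows.Current, null) ?? (object)DBNull.Value; }
        public string GetName(int i) { return _props[i].Name; }
        public int GetOrdinal(string name)
        {
            for (int i = 0; i < _props.Length; i++)
                if (string.Equals(_props[i].Name, name, StringComparison.OrdinalIgnoreCase)) return i;
            throw new IndexOutOfRangeException(name);
        }
        public bool IsDBNull(int i) { return GetValue(i) is DBNull; }
        public Type GetFieldType(int i) { return _props[i].PropertyType; }
        public void Dispose() { _rows.Dispose(); }
        public void Close() { Dispose(); }
        public bool IsClosed { get { return false; } }

        // Remaining IDataReader/IDataRecord members are not needed for this scenario.
        public int Depth { get { return 0; } }
        public int RecordsAffected { get { return -1; } }
        public bool NextResult() { return false; }
        public DataTable GetSchemaTable() { throw new NotSupportedException(); }
        public object this[int i] { get { return GetValue(i); } }
        public object this[string name] { get { return GetValue(GetOrdinal(name)); } }
        public int GetValues(object[] values) { throw new NotSupportedException(); }
        public string GetDataTypeName(int i) { return _props[i].PropertyType.Name; }
        public bool GetBoolean(int i) { throw new NotSupportedException(); }
        public byte GetByte(int i) { throw new NotSupportedException(); }
        public long GetBytes(int i, long fo, byte[] buf, int off, int len) { throw new NotSupportedException(); }
        public char GetChar(int i) { throw new NotSupportedException(); }
        public long GetChars(int i, long fo, char[] buf, int off, int len) { throw new NotSupportedException(); }
        public IDataReader GetData(int i) { throw new NotSupportedException(); }
        public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
        public decimal GetDecimal(int i) { throw new NotSupportedException(); }
        public double GetDouble(int i) { throw new NotSupportedException(); }
        public float GetFloat(int i) { throw new NotSupportedException(); }
        public Guid GetGuid(int i) { throw new NotSupportedException(); }
        public short GetInt16(int i) { throw new NotSupportedException(); }
        public int GetInt32(int i) { throw new NotSupportedException(); }
        public long GetInt64(int i) { throw new NotSupportedException(); }
        public string GetString(int i) { throw new NotSupportedException(); }
    }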

Upvotes: 3

David

Reputation: 73564

If you're using SQL Server, you should use the SqlBulkCopy class.

I converted one of my apps from a loop like the one you describe to this class for a large data set, and the performance boost was unbelievable.
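Since you already have a DataTable, you don't even need a custom reader: SqlBulkCopy.WriteToServer accepts a DataTable directly. A minimal sketch (the table name, column handling, and connection string are placeholders for your own schema):

    using System.Data;
    using System.Data.SqlClient;

    static void BulkInsert(DataTable dt, string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (SqlBulkCopy bcp = new SqlBulkCopy(conn))
            {
                // "MyTable" is a placeholder for your destination table.
                bcp.DestinationTableName = "MyTable";

                // Map by name so the DataTable's column order doesn't matter.
                foreach (DataColumn col in dt.Columns)
                    bcp.ColumnMappings.Add(col.ColumnName, col.ColumnName);

                bcp.WriteToServer(dt);  // one bulk operation instead of one INSERT per row
            }
        }
    }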

Upvotes: 5
