Kornelije Petak

Reputation: 9572

Inserting a large number of rows into SQL CE 4.0 with Entity Framework 4 (performance issue)

I have a small SQL CE 4.0 database with several tables, mapped using Entity Framework 4.

Here is the code I have:

foreach (String designation in newItemDesignations)
{
    ParameterData defaultValue = PDM.GetDefaultParameterData(designation);

    // Fill the ItemParameterDBO object with the data
    ItemParameterDBO dbParam = new ItemParameterDBO();
    dbParam.ItemID = dbItem.ID;
    dbParam.Designation = designation;    
    dbParam.High = defaultValue.High;
    dbParam.Low = defaultValue.Low;

    database.ItemParameterDBOes.AddObject(dbParam);
}

database.SaveChanges();

This code runs 24 times, and each time the newItemDesignations list contains exactly 525 elements. That's a total of 12600 rows to add.

The complete process takes 509 seconds, which seems far too long for 12600 rows.

I am aware that I am calling SaveChanges 24 times. At the moment, the application design does not allow me to put all inserts into a single transaction (i.e., a single SaveChanges call). But even looking at a single batch: 509 / 24 ≈ 21 seconds for 525 rows, or about 40 ms per row.
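For reference, a single-SaveChanges version (which the current design doesn't allow) would look roughly like the sketch below; allBatches is a placeholder for the 24 lists of designations, and the per-batch handling of dbItem is glossed over:

// Sketch only: allBatches stands in for the 24 designation lists,
// and the per-batch dbItem lookup is omitted.
foreach (List<String> newItemDesignations in allBatches)
{
    foreach (String designation in newItemDesignations)
    {
        ParameterData defaultValue = PDM.GetDefaultParameterData(designation);

        ItemParameterDBO dbParam = new ItemParameterDBO();
        dbParam.ItemID = dbItem.ID;
        dbParam.Designation = designation;
        dbParam.High = defaultValue.High;
        dbParam.Low = defaultValue.Low;

        database.ItemParameterDBOes.AddObject(dbParam);
    }
}

// One SaveChanges call, one transaction, instead of 24.
database.SaveChanges();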

I've timed my other code (everything other than adding to the database and saving changes). It takes a total of about 100 ms for all 12600 rows, roughly 0.02% of the overall time, so that's obviously not the problem. Virtually all of the processing time is spent in EF4's AddObject and SaveChanges.

I am also aware that I am setting the ItemID property, which is a foreign key. It's an integer, so I assume it shouldn't matter much, but I can't be sure.

Also note: there are no indexes set on any of the tables (other than the primary/foreign keys).

Upvotes: 8

Views: 4702

Answers (3)

Adam Houldsworth

Reputation: 64537

To build on ErikEJ's answer and your own sample, you can use an IDataReader implementation over a list to stream the data into WriteToServer, instead of duplicating the values in a DataTable. See this question:

Get an IDataReader from a typed List

I implemented this at work once; it didn't appear to improve performance, but it did appear to reduce memory consumption.
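As a rough illustration, the usage might look like the sketch below. ObjectDataReader<T> stands in for whatever IDataReader-over-a-list implementation you take from the linked question (it is not part of the SqlCeBulkCopy library), and the destination table name is assumed:

// Sketch only: ObjectDataReader<T> is a hypothetical wrapper that exposes an
// IEnumerable<T> as an IDataReader, as discussed in the linked question.
List<ItemParameterDBO> parameters = GetParameters(); // hypothetical helper returning the rows

SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(@"Data Source = C:\Database.sdf");
bulkInsert.DestinationTableName = "ItemParameters"; // assumed table name

using (IDataReader reader = new ObjectDataReader<ItemParameterDBO>(parameters))
{
    // Rows are streamed from the reader instead of being copied into a DataTable first.
    bulkInsert.WriteToServer(reader);
}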

Upvotes: 1

Kornelije Petak

Reputation: 9572

Since examples are scarce, here is the code I tested; it worked flawlessly. Thanks to ErikEJ for the SqlCeBulkCopy library. A must-have.

DataTable table = new DataTable();

table.Columns.Add(new DataColumn("A", typeof(int)));
table.Columns.Add(new DataColumn("B", typeof(String)));
table.Columns.Add(new DataColumn("C", typeof(Byte)));

for(int i = 0; i < 12000; i++)
{
    DataRow row = table.NewRow();
    row["A"] = "124324"
    row["B"] = "something";
    row["C"] = 15;

    table.Rows.Add(row);
}

String connString = @"Data Source = C:\Database.sdf";
SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString);
bulkInsert.DestinationTableName = "Items";
bulkInsert.WriteToServer(table);

Inserting my 12600 rows (from the question) this way took less than 2 seconds.

This example isn't ideal, since it's not strongly typed, but it shows how to manually build a DataTable and insert it into the database using the SqlCeBulkCopy library (see the accepted answer for a link).
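For completeness, here is a sketch that fills the table from the question's typed ItemParameterDBO objects instead of hard-coded literals; the column types and the destination table name are assumptions, since the real schema isn't shown:

// Sketch only: column names/types and the destination table name are assumed.
DataTable table = new DataTable();
table.Columns.Add("ItemID", typeof(int));
table.Columns.Add("Designation", typeof(String));
table.Columns.Add("High", typeof(double));
table.Columns.Add("Low", typeof(double));

foreach (ItemParameterDBO p in parameters) // parameters built as in the question
{
    table.Rows.Add(p.ItemID, p.Designation, p.High, p.Low);
}

SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(@"Data Source = C:\Database.sdf");
bulkInsert.DestinationTableName = "ItemParameters"; // assumed table name
bulkInsert.WriteToServer(table);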

Upvotes: 10

ErikEJ

Reputation: 41819

You could consider using my SqlCeBulkCopy library to bypass EF: http://sqlcebulkcopy.codeplex.com

Upvotes: 6
