My2ndLovE

Reputation: 397

How to quickly delete then insert multiple records into database

I have an ASP.NET web application and I need to delete and insert multiple records in one shot.

Let's say a user clicks on a button. It will delete all the existing records for this user and insert a new set of records into the database.

For example, User A has 720 records in the database. When the user clicks the Update button, it will delete those existing 720 records and insert a set of new records (say, 854 records) into the database.

This function will be used frequently by the user.

I plan to do it in this way:

  1. Delete all records for a user

  2. Store all the records in a collection

  3. Use a foreach loop over the entire collection and insert the records one by one

But I am afraid there might be a performance issue.

Is there any better way to do this?

Upvotes: 1

Views: 6231

Answers (6)

FAISAL

Reputation: 350

You can bulk-insert rows into a database table from a DataTable.

Use the following; I have used it many times:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using AssamExitPollReport;   // contains the clsdata helper class shown below

    public partial class Default : System.Web.UI.Page
    {
        DataSet ds = new DataSet();
        clsdata obj = new clsdata();

        protected void Page_Load(object sender, EventArgs e)
        {
            string tablename = Request.QueryString["tablename"];

            Response.Write("<h3>Execution Started on " + DateTime.Now.ToLocalTime() + "</h3>");

            if (tablename != null)
            {
                bool flg = BulkInsertDataTable(tablename);
                if (flg)
                    Response.Write("<br><h3>Successfully executed on " + DateTime.Now.ToLocalTime() + "</h3>");
                else
                    Response.Write("<br><h3>Oops! Something is wrong.</h3>");
            }
            else
            {
                Response.Write("<br><h3>Oops! @parameter \"tablename\" is missing.</h3>");
            }
        }

        public bool BulkInsertDataTable(string tablename)
        {
            bool isSuccess = true;
            try
            {
                string client = "Server=databasehost\\SQLEXPRESS;Database=dbname;Uid=username;Pwd=yourpassword;Trusted_Connection=no";

                // Load the rows to copy and clear the destination via stored procedures.
                ds = obj.Executedataset("select_tablename");
                obj.ExecuteDataset("delete_temp", new object[] { tablename });

                // Bulk-copy the whole DataTable into the destination table in one shot.
                using (SqlConnection destinationConnection = new SqlConnection(client))
                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    destinationConnection.Open();
                    bulkCopy.DestinationTableName = tablename;
                    bulkCopy.WriteToServer(ds.Tables[0]);
                }
            }
            catch (Exception)
            {
                isSuccess = false;
            }

            return isSuccess;
        }
    }

The clsdata class is used for executing the stored procedures:

using System;
using System.Data;
using System.Configuration;
using Microsoft.ApplicationBlocks.Data;

namespace AssamExitPollReport
{
    public class clsdata
    {
       // string conn = "Data Source=ADMIN-B19C3BADF;Initial Catalog=bazarkhodro;Integrated Security=True";



        public DataSet Executedataset(string spName)
        {
            return SqlHelper.ExecuteDataset(System.Configuration.ConfigurationManager.ConnectionStrings["conn1"].ConnectionString, CommandType.StoredProcedure, spName);
        }
        public DataSet ExecuteDataset(string spName, object[] values)
        {
            return SqlHelper.ExecuteDataset(System.Configuration.ConfigurationManager.ConnectionStrings["conn1"].ConnectionString, spName, values);
        }
        public DataSet ExecuteDataset(string spName, object[] values, string conn)
        {
            return SqlHelper.ExecuteDataset(conn, spName, values);
        }
        public int ExecuteNonQuery(string spName, object[] values, string conn)
        {
            int i = 0;
            try
            {

                i = SqlHelper.ExecuteNonQuery(conn, spName, values);

            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
            return i;

        }

        public int ExecuteScaler(string spName, object[] values)
        {
            int i = 0;
            try
            {
                i = Convert.ToInt32(SqlHelper.ExecuteScalar(System.Configuration.ConfigurationManager.ConnectionStrings["conn"].ConnectionString, spName, values));
            }
            catch (Exception)
            {

                throw;
            }
            return i;
        }

        public int ExecuteScaler(string spName, object[] values, string conn)
        {
            int i = 0;
            try
            {
                i = Convert.ToInt32(SqlHelper.ExecuteScalar(conn, spName, values));

            }
            catch (Exception)
            {

                throw;
            }
            return i;
        }

    }
}

where the connection strings (conn1 and conn) are defined in web.config.

Upvotes: 0

paparazzo

Reputation: 45096

If you have a collection of objects then I would:
Put it in a transaction
Delete
Insert using Table Valued Parameter (TVP)
(and sort the insert on the PK to minimize fragmentation)

The insert via the TVP is kind of like a reverse DataReader.
If you had raw data you were parsing and loading, I would say bulk copy, but you state you have a collection.
There are good references on how to use a collection with a TVP; a sketch of the pattern is below.

A simple insert of one row at a time should be pretty fast for fewer than 1,000 rows.
Where that would have problems is a very active table where it is hard to get a write lock.
Even with a simple insert you can pack up to 100 rows into one statement with VALUES (...),(...).
But I would still go with the TVP for anything over 100 rows if you are looking for speed.
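
A minimal sketch of that pattern, assuming for illustration a user-defined table type dbo.RecordTvp, a table dbo.UserRecords and a Record class (none of these come from the question; adjust to your own schema):

    // Hypothetical table type, created once in the database:
    //   CREATE TYPE dbo.RecordTvp AS TABLE (RecordId INT, UserId INT, Value NVARCHAR(100));
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Linq;

    // Hypothetical shape of the objects in the collection.
    public class Record { public int RecordId; public int UserId; public string Value; }

    public static class UserRecordWriter
    {
        public static void ReplaceUserRecords(string connectionString, int userId, IEnumerable<Record> records)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var transaction = connection.BeginTransaction())
                {
                    // 1. Delete the user's existing rows inside the transaction.
                    using (var delete = new SqlCommand(
                        "DELETE FROM dbo.UserRecords WHERE UserId = @UserId", connection, transaction))
                    {
                        delete.Parameters.AddWithValue("@UserId", userId);
                        delete.ExecuteNonQuery();
                    }

                    // 2. Build a DataTable matching the table type, sorted on the PK
                    //    to minimize fragmentation.
                    var tvp = new DataTable();
                    tvp.Columns.Add("RecordId", typeof(int));
                    tvp.Columns.Add("UserId", typeof(int));
                    tvp.Columns.Add("Value", typeof(string));
                    foreach (var record in records.OrderBy(r => r.RecordId))
                        tvp.Rows.Add(record.RecordId, record.UserId, record.Value);

                    // 3. Insert every row with a single statement through the TVP.
                    using (var insert = new SqlCommand(
                        "INSERT INTO dbo.UserRecords (RecordId, UserId, Value) " +
                        "SELECT RecordId, UserId, Value FROM @Rows", connection, transaction))
                    {
                        var rows = insert.Parameters.AddWithValue("@Rows", tvp);
                        rows.SqlDbType = SqlDbType.Structured;
                        rows.TypeName = "dbo.RecordTvp";
                        insert.ExecuteNonQuery();
                    }

                    transaction.Commit();
                }
            }
        }
    }

Because the delete and the TVP insert share one transaction, the user's rows are never left half-replaced if something fails.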

Upvotes: 3

Morbia

Reputation: 4344

You can use an ORM like Entity Framework, which supports batch updates.

With Entity Framework you can do it like this:

MyEntities dbContext = new MyEntities();
dbContext.Users.Add(new User(){Name="User1"});
dbContext.Users.Add(new User(){Name="User2"});
dbContext.SaveChanges();

To delete all records using Entity Framework:

dbContext.Database.ExecuteSqlCommand("delete from User");

You can also do all of the changes in one batch:

//get all or needed users
List<User> users = dbContext.Users.ToList();

foreach (var user in users)
{
    if (CheckCondition) //update user
    {
        user.Name = "new name";
        dbContext.Entry(user).State = EntityState.Modified;
    }
    else if (CheckCondition) // delete user
    {
        dbContext.Entry(user).State = EntityState.Deleted;
    }
}
dbContext.Users.Add(new User() {Name = "Name3"}); //add new user
dbContext.SaveChanges(); //save all changes in one batch in a single transaction

Upvotes: 2

usr

Reputation: 171206

If many of the records will stay the same or only need to be updated, you don't need a delete-insert pair. You can use the MERGE statement together with table-valued parameters to perform the inserts, updates and deletes in a single, maximally efficient statement. This is likely to be the fastest way, because MERGE is well-optimized to perform all of the writes at once in the best possible way.

My experience is that this technique works nicely in practice.
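
A minimal sketch of what such a MERGE could look like, reusing the hypothetical dbo.RecordTvp table type, dbo.UserRecords table and tvp DataTable from the TVP sketch in the answer above (illustrative names only, not from the question):

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(@"
        MERGE dbo.UserRecords AS target
        USING @Rows AS source
            ON target.UserId = source.UserId AND target.RecordId = source.RecordId
        WHEN MATCHED THEN
            UPDATE SET target.Value = source.Value
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (RecordId, UserId, Value) VALUES (source.RecordId, source.UserId, source.Value)
        WHEN NOT MATCHED BY SOURCE AND target.UserId = @UserId THEN
            DELETE;", connection))
    {
        command.Parameters.AddWithValue("@UserId", userId);
        var rows = command.Parameters.AddWithValue("@Rows", tvp);  // same DataTable shape as in the TVP sketch
        rows.SqlDbType = SqlDbType.Structured;
        rows.TypeName = "dbo.RecordTvp";
        connection.Open();
        command.ExecuteNonQuery();  // updates, inserts and deletes in one round trip
    }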

Upvotes: 0

reptildarat

Reputation: 1036

Why not just update the fields that need to change? If you are worried about performance, use raw queries instead of a big ORM. Judging from your needs, I think it's better to use a transaction when you do the delete and insert operations; a sketch follows the link below.

Here is a post that I think will be useful for you:

ado-net-sqltransaction-improves-performance
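
A minimal sketch of the raw-query approach wrapped in a transaction, again using a hypothetical dbo.UserRecords table and record collection (illustrative names, not from the question); the delete and the row-by-row inserts either all commit or all roll back:

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var transaction = connection.BeginTransaction())
        {
            try
            {
                // Delete the user's existing rows.
                using (var delete = new SqlCommand(
                    "DELETE FROM dbo.UserRecords WHERE UserId = @UserId", connection, transaction))
                {
                    delete.Parameters.AddWithValue("@UserId", userId);
                    delete.ExecuteNonQuery();
                }

                // Insert the new rows one by one, reusing a single parameterized command.
                using (var insert = new SqlCommand(
                    "INSERT INTO dbo.UserRecords (RecordId, UserId, Value) VALUES (@RecordId, @UserId, @Value)",
                    connection, transaction))
                {
                    insert.Parameters.Add("@RecordId", SqlDbType.Int);
                    insert.Parameters.Add("@UserId", SqlDbType.Int);
                    insert.Parameters.Add("@Value", SqlDbType.NVarChar, 100);

                    foreach (var record in records)
                    {
                        insert.Parameters["@RecordId"].Value = record.RecordId;
                        insert.Parameters["@UserId"].Value = userId;
                        insert.Parameters["@Value"].Value = record.Value;
                        insert.ExecuteNonQuery();
                    }
                }

                transaction.Commit();
            }
            catch
            {
                transaction.Rollback();
                throw;
            }
        }
    }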

Upvotes: 1

Yosi Dahari

Reputation: 6999

All you need to do is DELETE the user's records before performing the INSERT. Why not do this as a database operation (run this statement before inserting the user's records):

DELETE FROM MyTable WHERE UserId = @UserId

About the fastest way to insert records, you can find many posts:

How to do very fast inserts to SQL Server 2008

Fastest way to insert in parallel to a single table

Upvotes: 2
