NoobCoder

Reputation: 21

What is the fastest way to insert a large amount of CSV data into a MySQL database using C#?

I am trying to insert data from an Excel pivot table into MySQL using C# (I manage the database through phpMyAdmin). I was able to insert the data with plain INSERT statements, but the performance is far too slow for my purposes: each data file contains at least 15,000 rows, and the operation will run frequently. At the moment I hold the data in an ArrayList before insertion. I read online that I should put the data in a CSV file and use LOAD DATA INFILE instead. However, my CSV file does not have the same number of columns as my SQL table, I am not sure how to pass the path of the CSV file to the statement, and I don't want the whole operation to fail if one of the rows is a duplicate. I would like to use the INFILE method if it can help my situation.

The error I am getting now is: File not found (Errcode: 22 "Invalid argument")

Here is my code so far for attempting the INFILE method:

    OpenFileDialog ofd3 = new OpenFileDialog();

    if (ofd3.ShowDialog() == DialogResult.OK)
    {
        DirectoryInfo di = new DirectoryInfo("../../files/"); // currently unused

        // Full path of the CSV file chosen by the user
        string hours = ofd3.FileName;

        string cmd = "LOAD DATA INFILE '" + hours + "' INTO TABLE timesheet FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'";

        mysql_insert_infile(cmd);
    }

    private void mysql_insert_infile(string query)
    {
        using (var connection = new MySqlConnection("Server=localhost;Database=projectdash;Uid=root;Pwd=;allow zero datetime=yes;Allow User Variables=True"))
        using (var cmd = connection.CreateCommand())
        {
            connection.Open();
            cmd.CommandText = query;

            try
            {
                cmd.ExecuteNonQuery();
            }
            catch (Exception w)
            {
                // The exception is swallowed here, which hides errors such
                // as the "File not found" message above.
                //   MessageBox.Show(w.Message);
            }
        }
    }
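For reference, a minimal sketch of how the statement could be built so the Windows path survives MySQL's escaping and a duplicate row does not abort the load. The IGNORE keyword and the explicit column list are assumptions about the schema; col1, col2, col3 stand in for the real column names:

    // Sketch: MySQL treats '\' inside the quoted filename as an escape
    // character, a common cause of "File not found (Errcode: 22)" on
    // Windows; doubling the backslashes (or using '/') avoids it.
    string path = hours.Replace(@"\", @"\\");

    // IGNORE skips rows that would violate a unique key instead of
    // failing the whole load; the trailing column list maps the CSV's
    // columns onto a table that has more columns than the file
    // (col1, col2, col3 are placeholders).
    string cmd = "LOAD DATA INFILE '" + path + "' IGNORE INTO TABLE timesheet" +
                 " FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'" +
                 " (col1, col2, col3)";

    mysql_insert_infile(cmd);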

Upvotes: 1

Views: 679

Answers (1)

Balbinator

Reputation: 220

My bet would be to load the entire file into memory and then build "bulk" INSERT commands containing at least 1,000 rows each.
You can clean your data beforehand to avoid duplicates, and build the command using only the columns you need.
Just remember to insert multiple rows per statement, as described here:

Inserting multiple rows in mysql
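A minimal sketch of that approach, assuming an open MySqlConnection named connection, a hypothetical Row class holding the parsed CSV fields, and the usual using directives (System.Linq, System.Text); the timesheet column names are placeholders:

    // Sketch: one parameterized multi-row INSERT per batch of 1000 rows.
    // INSERT IGNORE skips rows that collide with a unique key instead of
    // failing the whole statement.
    const int batchSize = 1000;
    for (int offset = 0; offset < rows.Count; offset += batchSize)
    {
        var batch = rows.Skip(offset).Take(batchSize).ToList();

        var sql = new StringBuilder("INSERT IGNORE INTO timesheet (name, day, hours) VALUES ");
        using (var cmd = connection.CreateCommand())
        {
            for (int i = 0; i < batch.Count; i++)
            {
                if (i > 0) sql.Append(",");
                sql.AppendFormat("(@n{0}, @d{0}, @h{0})", i);
                cmd.Parameters.AddWithValue("@n" + i, batch[i].Name);
                cmd.Parameters.AddWithValue("@d" + i, batch[i].Day);
                cmd.Parameters.AddWithValue("@h" + i, batch[i].Hours);
            }
            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery(); // one round-trip per thousand rows
        }
    }

Parameters keep the Excel values safely quoted, and sending a thousand rows per command avoids the per-row round-trip that makes single-row INSERTs slow.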

Upvotes: 1
