Reputation: 3
I want to parse a CSV file, store it in a list, then insert values from that list into a database. Here is a sample of my code. I'm still learning, so if anything is unclear I will explain further. The idea is to parse the CSV file row by row and insert each row into my database. Thank you in advance.
public class SKU : List<string[]>
{
    public string SKU_ID { get; set; }
    public string SKU_Name { get; set; }
    public string Code { get; set; }
    public string Product_Name { get; set; }
    public string DistributionCenter_Name { get; set; }

    internal static SKU ParseRow(string row)
    {
        var columns = row.Split(';');
        return new SKU()
        {
            SKU_ID = columns[0],
            SKU_Name = columns[1],
            Code = columns[2],
            Product_Name = columns[3],
            DistributionCenter_Name = columns[4],
        };
    }
}
In the script I named each column the same way as in the CSV file and in my database.
My Main method is as follows:
class Programm
{
    static void Main(string[] args)
    {
        var sku_info = ProcessCSV("skutest1.csv");
        SqlConnection conn = new SqlConnection();
        conn.ConnectionString = @"...";
        foreach (var information in sku_info)
        {
            using SqlConnection connection = new SqlConnection(conn.ConnectionString);
            string commandString = ("INSERT INTO SKU VALUES ('" + information.SKU_ID + " " + information.SKU_Name + " " + information.Code + " " + information.Product_Name + " " + information.DistributionCenter_Name + "')");
            conn.Open();
            SqlTransaction transaction = conn.BeginTransaction();
            SqlCommand cmd = new SqlCommand(commandString, conn, transaction);
            cmd.ExecuteNonQuery();
            transaction.Commit();
        }
        Console.ReadKey();
    }

    private static List<SKU> ProcessCSV(string path)
    {
        return File.ReadAllLines("C:/.../skutest1.csv").Where(row => row.Length > 0).Select(SKU.ParseRow).ToList();
    }
}
Upvotes: 0
Views: 1076
Reputation: 131403
The fastest way to load a file that is already on the server is BULK INSERT, e.g.:
BULK INSERT someTable
FROM 'pathtofile'
WITH ( FORMAT = 'CSV' )
You can do something similar from the client by using ADO.NET's SqlBulkCopy class, which executes a BULK INSERT operation with data sent from the client. SqlBulkCopy expects either a DataTable or an IDbDataReader parameter. You can use CsvHelper's CsvDataReader for this: it parses a CSV file and exposes the IDbDataReader interface that SqlBulkCopy needs.
The code could be as simple as this:
using var txtReader = File.OpenText(pathToCSV);
var reader = new CsvReader(txtReader, CultureInfo.InvariantCulture);
using var dbReader = new CsvDataReader(reader);

using var bcp = new SqlBulkCopy(cns);
bcp.ColumnMappings.Add("sku_id", "sku_id");
bcp.ColumnMappings.Add("sku_name", "sku_name");
...
bcp.DestinationTableName = table;
bcp.WriteToServer(dbReader);
Without the mappings, SqlBulkCopy will send the columns in the order they appear in the file. If that doesn't match the order of the table's columns, you'll get an error or, worse, mixed-up data.
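If the CSV's header names happen to match the table's column names (an assumption, not shown in the question), the mappings can also be generated from the reader instead of hard-coded. A minimal sketch of that loop, using a DataTable reader as a stand-in for CsvDataReader, since any IDataReader exposes the FieldCount and GetName members the loop needs:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Stand-in data source: two columns named like the ones in the question.
var table = new DataTable();
table.Columns.Add("sku_id");
table.Columns.Add("sku_name");
using var reader = table.CreateDataReader();

// Collect the reader's column names; with a real SqlBulkCopy you would
// call bcp.ColumnMappings.Add(name, name) for each one instead.
var names = new List<string>();
for (int i = 0; i < reader.FieldCount; i++)
    names.Add(reader.GetName(i));

Console.WriteLine(string.Join(",", names)); // → sku_id,sku_name
```

This only works when file and table agree on names; if they differ, the explicit hard-coded mappings above are the safer option.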
Upvotes: 1
Reputation: 162
Here's a short tutorial on building a parameterized command string - this is a much safer way of inserting into your database.
Here's a sample of how you could parameterize your insert:
string commandString = @"INSERT INTO SKU VALUES (@sku_id, @sku_name, @code, @product_name, @distributioncenter_name)";
conn.Open();
SqlTransaction transaction = conn.BeginTransaction();
SqlCommand cmd = new SqlCommand(commandString, conn, transaction);
cmd.Parameters.Add(new SqlParameter("@sku_id", information.SKU_ID));
cmd.Parameters.Add(new SqlParameter("@sku_name", information.SKU_Name));
cmd.Parameters.Add(new SqlParameter("@code", information.Code));
cmd.Parameters.Add(new SqlParameter("@product_name", information.Product_Name));
cmd.Parameters.Add(new SqlParameter("@distributioncenter_name", information.DistributionCenter_Name));
cmd.ExecuteNonQuery();
transaction.Commit();
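One further hardening step worth considering: naming the target columns in the INSERT itself, so the statement keeps working even if columns are ever added to or reordered in the table. A sketch of just the command text (the column names here are assumed from the question's SKU class, not confirmed against the actual table):

```csharp
using System;

// Listing the columns explicitly decouples the INSERT from the
// table's physical column order; "INSERT INTO SKU VALUES (...)"
// silently depends on that order.
const string commandString =
    "INSERT INTO SKU (SKU_ID, SKU_Name, Code, Product_Name, DistributionCenter_Name) " +
    "VALUES (@sku_id, @sku_name, @code, @product_name, @distributioncenter_name)";
Console.WriteLine(commandString);
```

The parameters are then added exactly as in the sample above.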
Upvotes: 0