Reputation: 414
Hi, I am trying to migrate some projects to ASP.NET Core, and I am having trouble working out the correct way to upload a .CSV file to my SQL database.
In the past I would upload the .CSV file, loop through it, store the rows in a DataTable, and then simply call SqlBulkCopy.WriteToServer. However, I have found that .NET Core currently has no DataTable type, and its SqlBulkCopy.WriteToServer accepts a DbDataReader instead of a DataTable.
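For reference, this is roughly what that old approach looked like on the full .NET Framework (the table name, column names, file path, and connection string below are just placeholders):

using System.Data;
using System.Data.SqlClient;
using System.IO;

// Build a DataTable from the CSV, then push it with SqlBulkCopy.
var table = new DataTable();
table.Columns.Add("Col1", typeof(int));
table.Columns.Add("Col2", typeof(int));
table.Columns.Add("Col3", typeof(int));

foreach (var line in File.ReadLines("upload.csv"))
{
    var parts = line.Split(',');
    table.Rows.Add(int.Parse(parts[0]), int.Parse(parts[1]), int.Parse(parts[2]));
}

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "[tblBulkCopy]";
    bulkCopy.WriteToServer(table); // the DataTable overload that is not available in .NET Core
}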
What is the best way to get this working? I have the file uploaded and I can work with its contents; I just need to get the data into the proper format to bulk-load it into my SQL database.
Open to any ideas. Thanks.
Upvotes: 4
Views: 5318
Reputation: 51
I solved it using FastMember's ObjectReader (from the FastMember NuGet package; it exposes an IEnumerable as a DbDataReader, which is what SqlBulkCopy wants) and a column mapping, like this:
List<Data> lstData = new List<Data>();
// fill the list of data from the uploaded CSV (see the sketch below)

using (var sqlCopy = new SqlBulkCopy(Config.get("Data:db:ConnectionString")))
{
    sqlCopy.DestinationTableName = "[tblBulkCopy]";
    sqlCopy.BatchSize = 500;

    // Map source column index to column index in the destination table.
    // In the database table the first column (index 0) is the identity column,
    // so the data columns start at index 1.
    sqlCopy.ColumnMappings.Add(0, 1);
    sqlCopy.ColumnMappings.Add(1, 2);
    sqlCopy.ColumnMappings.Add(2, 3);

    // Project the list onto anonymous objects whose property order matches the mappings.
    var records = lstData.Select(obj => new { col1 = obj.number1, col2 = obj.number2, col3 = obj.number3 });

    using (var reader = ObjectReader.Create(records, new[] { "col1", "col2", "col3" }))
    {
        sqlCopy.WriteToServer(reader);
    }
}
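To fill lstData from the uploaded CSV in the first place, something along these lines should do for a simple comma-separated file with no quoted fields (here file is assumed to be the IFormFile from your upload action, and Data is a plain class with three int properties):

public class Data
{
    public int number1 { get; set; }
    public int number2 { get; set; }
    public int number3 { get; set; }
}

// Parse the uploaded file line by line into the list used above.
// 'file' is assumed to be the IFormFile received by the controller action.
List<Data> lstData = new List<Data>();
using (var reader = new StreamReader(file.OpenReadStream()))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        var parts = line.Split(',');
        lstData.Add(new Data
        {
            number1 = int.Parse(parts[0]),
            number2 = int.Parse(parts[1]),
            number3 = int.Parse(parts[2])
        });
    }
}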
Upvotes: 4