Reputation: 153
I'm working on a pet project that will let me store my game collection in a DB and write notes on those games. Single game entries have so far been handled by inserting the desired variables into my game_information
table and outputting the PK (identity) of the newly created row, so I can insert it into my game_notes table along with the note.
var id = db.QueryValue("INSERT INTO Game_Information (gamePrice, name, edition) output Inserted.gameId VALUES (@0, @1, @2)", gamePrice, name, edition);
db.Execute("INSERT INTO Game_Notes(gameId, notes, noteDate) VALUES (@0, @1, @2)", id, notes, noteDate);
I'm now experimenting with uploading data in bulk via CSV, but how can I write a BULK INSERT that outputs all the PKs of the newly created rows, so I can insert them into my second table (game_notes) along with a variable called notes?
At the moment I have the following:
Stored procedure that reads the .csv and uses BULK INSERT to dump the information into a view of game_information:
CREATE PROCEDURE Procedure1
    @FileName nvarchar(200)
AS
BEGIN
    DECLARE @sql nvarchar(MAX);

    -- Build the BULK INSERT dynamically so the uploaded file name can be used
    SET @sql = 'BULK INSERT myview
    FROM ''' + @FileName + '''
    WITH
    (
        FIELDTERMINATOR = '','',
        ROWTERMINATOR = ''\n'',
        FIRSTROW = 2
    )';

    EXEC(@sql);
END
C# code that sets this up in WebMatrix:
if (IsPost && !String.IsNullOrEmpty(Request.Files[0].FileName))
{
    var uploadedFile = Request.Files[0];
    var fileName = Path.GetFileName(uploadedFile.FileName);
    var uploadFolder = "";                                  // upload folder path (omitted in the original)
    var fileSavePath = Path.Combine(uploadFolder, fileName);

    uploadedFile.SaveAs(fileSavePath);

    // Run the bulk-import procedure against the saved file, then remove it
    var command = "EXEC Procedure1 @FileName = @0";
    db.Execute(command, fileSavePath);
    File.Delete(fileSavePath);
}
This allows the CSV records to be inserted into game_information.
If this isn't feasible with BULK INSERT, would something along the lines of the following be a valid approach to attempt?
BULK INSERT into a temp_table
INSERT from the temp_table into my game_information table
OUTPUT the game_Ids from that INSERT (as an array?)
then INSERT the Ids along with the note into game_notes
I've also been looking at OPENROWSET, but I'm unsure whether that will allow for what I'm trying to accomplish. Feedback on this is greatly appreciated.
Upvotes: 5
Views: 3778
Reputation: 153
Thank you for your input, womp. I was able to get the desired results by amending my BULK INSERT as follows:
BEGIN
    DECLARE @sql nvarchar(MAX);
    SET @sql =
    'CREATE TABLE #Temp (--define table--)

    BULK INSERT #Temp                 -- bulk load into the temp table
    FROM ' + char(39) + @FileName + char(39) + '
    WITH
    (
        FIELDTERMINATOR = '','',
        ROWTERMINATOR = ''\n'',
        FIRSTROW = 2
    )

    INSERT INTO myDB.dbo.game_information (gamePrice, name, edition, date)
    OUTPUT INSERTED.gameId, INSERTED.Date INTO myDB.dbo.game_notes (gameId, noteDate)
    SELECT gamePrice, name, edition, date
    FROM #Temp'

    EXEC(@sql)
END
This placed the correct ids into game_notes and left the Notes column as NULL for those entries, which meant I could run a simple
"UPDATE game_notes SET Notes = @0 WHERE Notes IS NULL";
to push the desired note into the correct rows. I'm executing this and the stored bulk procedure inside the same if (IsPost) block, so I feel protected from accidentally updating the wrong notes.
Upvotes: 2
Reputation: 116977
You have a few different options.
Bulk inserting into a temp table and then copying the information into your permanent tables is definitely a valid solution. However, based on what you're trying to do, I don't see the need for a temp table. Just bulk import into game_information, SELECT your IDs back to your application, and then do your update of game_notes.
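A minimal sketch of that first option, reusing the view (myview) and column names from the question; the file path and note text are placeholders, and it assumes nothing else inserts into Game_Information while the load runs:
DECLARE @lastId int = (SELECT ISNULL(MAX(gameId), 0) FROM Game_Information);  -- remember the highest existing id

BULK INSERT myview
FROM 'C:\uploads\games.csv'   -- illustrative path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Every row whose id is above @lastId came from this load
DECLARE @notes nvarchar(MAX) = N'placeholder note text';
INSERT INTO Game_Notes (gameId, notes, noteDate)
SELECT gameId, @notes, GETDATE()
FROM Game_Information
WHERE gameId > @lastId;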
Another option would be to insert your own keys. You can turn IDENTITY_INSERT on for your tables and just have your keys as part of the CSV file. See here: https://msdn.microsoft.com/en-ca/library/ms188059.aspx?f=255&MSPPError=-2147217396. If you did this you could do a BULK INSERT into your Game_Information table, and then a second BULK INSERT into your secondary tables from a different CSV file. Be sure to re-enable key constraints and turn IDENTITY_INSERT off after it's finished.
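For BULK INSERT itself, the linked page describes keeping the file's identity values with the KEEPIDENTITY option; a rough sketch, assuming two CSV files whose columns line up with the tables and whose gameId values already match (paths are placeholders):
-- First file: game rows that already carry their gameId values
BULK INSERT Game_Information
FROM 'C:\uploads\game_information.csv'
WITH (KEEPIDENTITY, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Second file: notes keyed by the same gameId values
BULK INSERT Game_Notes
FROM 'C:\uploads\game_notes.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);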
If you need more particular control over the data you're selecting from the CSV file, then you can use OPENROWSET, but there aren't enough details in your post to comment further.
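For what it's worth, OPENROWSET(BULK ...) would let you combine the load with an OUTPUT clause in a single INSERT ... SELECT; this is only a sketch, and the data-file path and format file (which describes the CSV columns) are placeholders:
CREATE TABLE #NewIds (gameId int);

INSERT INTO Game_Information (gamePrice, name, edition)
OUTPUT INSERTED.gameId INTO #NewIds (gameId)
SELECT src.gamePrice, src.name, src.edition
FROM OPENROWSET(
    BULK 'C:\uploads\games.csv',          -- illustrative path
    FORMATFILE = 'C:\uploads\games.fmt',  -- format file describing the CSV columns
    FIRSTROW = 2
) AS src;

-- #NewIds now holds the identity values generated by this load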
Upvotes: 1