Dendrobates

Reputation: 3534

SQL Server Bulk import data from .csv into new table

I can't seem to find the answer to this quite trivial question.

I would like to bulk import data from a .csv file (with an unknown number of columns, comma separated) into a new SQL Server table, within an existing database. The BULK INSERT statement works fine if the table is predefined, but since I don't know the number of columns of the .csv file upfront, this won't work.

I was trying to use BULK INSERT in combination with OPENROWSET, but can't get it working.

By the way: SSIS won't be an option in this case, since I would like to run the query from within R (via sqlQuery) or Python.

Help would be highly appreciated!

Upvotes: 3

Views: 9544

Answers (2)

Kapil Gurjar

Reputation: 1

I am currently struggling with the same problem. My approach: first read the header row with BULK INSERT and create the table from it, then BULK INSERT the data again starting from row 2. Note that you will have to adjust the data types after checking the imported data.

-- Staging table for the header row
CREATE TABLE #Header (HeadString nvarchar(max));

DECLARE @TableName nvarchar(100) = 'byom.DenormReportingData_100_100';
DECLARE @Columns   nvarchar(max) = '';
DECLARE @Query     nvarchar(max) = '';
DECLARE @Query2    nvarchar(max) = '';

-- Read only the header row from the file
BULK INSERT #Header
FROM 'F:/Data/BDL_BI_Test.csv'
WITH (FIRSTROW = 1, LASTROW = 1);

-- Build the column list from the header (STRING_SPLIT requires
-- compatibility level 130+); every column starts as VARCHAR(500)
SELECT @Columns = (
    SELECT QUOTENAME(value) + ' VARCHAR(500)' + ','
    FROM #Header
    CROSS APPLY STRING_SPLIT(HeadString, ',')
    FOR XML PATH('')
);

IF ISNULL(@Columns, '') <> ''
BEGIN
    -- Drop the trailing comma and create the table
    SET @Columns = LEFT(@Columns, LEN(@Columns) - 1);
    SET @Query = 'CREATE TABLE ' + @TableName + ' (' + @Columns + ')';
    EXEC (@Query);
END

-- Import the data, skipping the header row
-- (FORMAT = 'csv' requires SQL Server 2017 or later)
SET @Query2 = 'BULK INSERT ' + @TableName + ' FROM ''F:/Data/BDL_BI_Test.csv'''
            + ' WITH (FIRSTROW = 2, FORMAT = ''csv'','
            + ' FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
EXEC (@Query2);
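The first of the two steps above (read the header row, build a CREATE TABLE from it with every column as VARCHAR(500)) can also be sketched in Python. The sample CSV text and table name below are made-up placeholders:

```python
import csv
import io

def create_table_sql(csv_text, table_name):
    """Build a CREATE TABLE statement from the CSV header row.

    Every column defaults to VARCHAR(500), mirroring the T-SQL above;
    adjust the data types after inspecting the imported data.
    """
    header = next(csv.reader(io.StringIO(csv_text)))
    cols = ", ".join(f"[{name}] VARCHAR(500)" for name in header)
    return f"CREATE TABLE {table_name} ({cols})"

# Example with an in-memory CSV (placeholder data and table name):
print(create_table_sql("id,name,price\n1,apple,0.5\n", "dbo.ImportTest"))
# CREATE TABLE dbo.ImportTest ([id] VARCHAR(500), [name] VARCHAR(500), [price] VARCHAR(500))
```

The generated statement would then be executed through whatever database driver you use, followed by a BULK INSERT with FIRSTROW = 2.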

Upvotes: 0

Dendrobates

Reputation: 3534

I have found a workaround in R to solve the problem above. The same logic can be applied in other languages. I advise everyone using this function to keep the useful comments above in mind.

I wrote a small function to capture the steps in R:

# Requires the RODBC package (sqlSave, sqlQuery)
SQLSave <- function(dbhandle, data, tablename) {
  # Export data to temp path, for example within your SQL Server directory.
  write.csv2(data,file = "\\\\pathToSQL\\temp.csv",row.names=FALSE,na="")

  # Write first 100 rows to SQL Server, to incorporate the data structure.
  sqlSave(dbhandle, head(data,100), tablename = tablename, rownames = FALSE, safer = FALSE)

  # SQL Query to remove data in the table, structure remains:
  sqlQuery(dbhandle,paste("DELETE FROM [",tablename,"]",sep=""));

  # SQL Query to bulk insert all data from temp .csv to SQL Server
  sqlQuery(dbhandle,paste("BULK INSERT [",tablename,"]
           FROM '\\\\pathToSQL\\temp.csv'
           WITH
           (
             FIELDTERMINATOR = ';',
             ROWTERMINATOR = '\\n',
             FIRSTROW = 2,
             KEEPNULLS
           )",sep=""));

  # Delete temp file from file directory
  file.remove("\\\\pathToSQL\\temp.csv")
}
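For the Python route mentioned in the question, the same logic can be sketched as a statement-building helper. The connection-side calls (e.g. writing the CSV and creating the table structure from the first rows via pandas' to_sql on a SQLAlchemy engine) are assumptions, so they appear only as comments:

```python
def bulk_insert_sql(table_name, csv_path):
    """Build the BULK INSERT statement issued by the R function above:
    semicolon-separated fields, data starting at row 2, NULLs kept."""
    return (
        f"BULK INSERT [{table_name}]\n"
        f"FROM '{csv_path}'\n"
        "WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\\n',\n"
        "      FIRSTROW = 2, KEEPNULLS)"
    )

# Sketch of the surrounding workflow (untested, engine setup omitted):
#   data.to_csv(r"\\pathToSQL\temp.csv", sep=";", index=False)  # export all rows
#   data.head(100).to_sql(tablename, engine, index=False)       # create structure
#   engine.execute(f"DELETE FROM [{tablename}]")                # keep structure only
#   engine.execute(bulk_insert_sql(tablename, r"\\pathToSQL\temp.csv"))
print(bulk_insert_sql("MyTable", r"\\pathToSQL\temp.csv"))
```

As in the R version, the .csv must live on a path the SQL Server service account can read, and the temp file should be removed afterwards.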

Upvotes: 0
