Reputation: 411
I need to import a CSV into a SQL Server 2008 database. One of the fields is a directory path, and over 1,600 rows contain a "," within that field's value, so those rows are split across the wrong columns in my database. There are over 100,000 rows in total.
How can I avoid this and modify my statement?
Here is the creation of the table:
create table tablename
(
Directory varchar(1000),
Name varchar(1000),
Size varchar(50),
CreationTime varchar(100),
LastAccessTime varchar(100),
LastWriteTime varchar(100)
)
Below is the code I used to import it:
BULK
INSERT tablename
FROM 'c:\EmailCSVs\myfile.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
An example in the directory field looks like this:
\\drive\share\nightly.175\savemail\user\(Monday, January 10, 2011, 500 PM)
And here is an example row in the CSV:
"\\drive\share\nightly.175\savemail\user\(Monday, January 10, 2011, 500 PM)","Archive #1.1.pst","271360","1/14/2011 8:37:05 AM","6/1/2011 12:00:38 AM","1/14/2011 2:45:27 PM"
Upvotes: 1
Views: 385
Reputation: 2514
The simplest way is to replace all of the field-separating commas in the CSV file with the pipe character |, then set
FIELDTERMINATOR = '|'
This is a standard technique used by many DBAs when the commas can't be escaped or removed.
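One way to do that replacement without corrupting the quoted Directory field is to let a CSV parser handle the quoting, then write the rows back out pipe-delimited. A minimal Python sketch of the idea (the function name is illustrative; it assumes | never appears in the data):

```python
import csv
import io

def csv_to_pipe(src_text: str) -> str:
    """Re-delimit quoted CSV text as pipe-separated text.

    csv.reader honors the double quotes, so commas inside the quoted
    Directory field stay within a single field; writing the parsed rows
    back with '|' as the delimiter removes the collision.
    """
    out = io.StringIO()
    # QUOTE_NONE: raise an error rather than silently quoting if a
    # field unexpectedly contains the new delimiter.
    writer = csv.writer(out, delimiter='|', quoting=csv.QUOTE_NONE)
    for row in csv.reader(io.StringIO(src_text)):
        writer.writerow(row)
    return out.getvalue()

# The example row from the question:
line = ('"\\\\drive\\share\\nightly.175\\savemail\\user\\'
        '(Monday, January 10, 2011, 500 PM)","Archive #1.1.pst",'
        '"271360","1/14/2011 8:37:05 AM","6/1/2011 12:00:38 AM",'
        '"1/14/2011 2:45:27 PM"\n')
print(csv_to_pipe(line))
```

To convert the whole file, read c:\EmailCSVs\myfile.csv, pass its contents through this, and bulk insert the result with FIELDTERMINATOR = '|'.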
Upvotes: 2
Reputation: 15251
You might try a bulk insert using a format file. A format file lets you specify a different terminator for each field, among other things: most fields could keep , as the terminator, while the terminator around the offending comma-laden field could be "," (quote-comma-quote), so the commas inside the quotes are no longer treated as separators.
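Since every field in the sample row is double-quoted, a non-XML format file can fold the quotes into each terminator. A sketch for the table above (10.0 is the SQL Server 2008 version number; the first entry maps to server column 0 so the leading quote is discarded; lengths mirror the table definition):

```
10.0
7
1  SQLCHAR  0  0     "\""      0  FirstQuote      ""
2  SQLCHAR  0  1000  "\",\""   1  Directory       ""
3  SQLCHAR  0  1000  "\",\""   2  Name            ""
4  SQLCHAR  0  50    "\",\""   3  Size            ""
5  SQLCHAR  0  100   "\",\""   4  CreationTime    ""
6  SQLCHAR  0  100   "\",\""   5  LastAccessTime  ""
7  SQLCHAR  0  100   "\"\r\n"  6  LastWriteTime   ""
```

Save it as, say, c:\EmailCSVs\myfile.fmt and reference it with FORMATFILE = 'c:\EmailCSVs\myfile.fmt' (alongside FIRSTROW = 2) in the WITH clause of the BULK INSERT, in place of FIELDTERMINATOR and ROWTERMINATOR. This only works if every field in every row is quoted, as in the sample.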
Another possibility is to open the CSV in Excel and create a linked server to the resulting workbook.
Upvotes: 0