Klaus Nji

Reputation: 18867

SQL Server: The bulk load failed. The column is too long in the data file for row 1, column 1

Can someone please help me here? I've been looking at this for a couple of hours now and am getting nowhere.

I created a table in SQL Express 2008 R2 using the following script:

CREATE TABLE Features
(
ID int not null identity(1,1 ),
StopID varchar(10), 
Code int,
Name varchar(100),
Summary varchar(200),
Lat real,
Lon real,
street varchar(100),
city varchar(50),
region varchar(50),
postcode varchar(10),
country varchar(20),
zone_id varchar(20),
the_geom geography,


 CONSTRAINT [PK_Features] PRIMARY KEY CLUSTERED 
(
    [ID] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

Then I created the following format file against my database table using the bcp tool:

10.0
12
1       SQLCHAR             2       100     ","    2     StopID               Latin1_General_CI_AS
2       SQLINT              1       4       ","    3     Code                 ""
3       SQLCHAR             2       100     ","    4     Name                 Latin1_General_CI_AS
4       SQLCHAR             2       200     ","    5     Summary              Latin1_General_CI_AS
5       SQLFLT4             1       4       ","    6     Lat                  ""
6       SQLFLT4             1       4       ","    7     Lon                  ""
7       SQLCHAR             2       100     ","    8     street               Latin1_General_CI_AS
8       SQLCHAR             2       50      ","    9     city                 Latin1_General_CI_AS
9       SQLCHAR             2       50      ","    10    region               Latin1_General_CI_AS
10      SQLCHAR             2       10      ","    11    postcode             Latin1_General_CI_AS
11      SQLCHAR             2       20      ","    12    country              Latin1_General_CI_AS
12      SQLCHAR             2       20      "\r\n"    13    zone_id              Latin1_General_CI_AS

This file has been modified to remove the ID and the_geom fields as these are not in my data file.

Then I tried to bulk insert a one-line CSV with the following content:

a,8,S,,45.439869,-75.695839,,,,,,
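For completeness, the statement I'm running is roughly the following (the file paths here are illustrative):

```sql
-- Roughly the statement used; paths are illustrative
BULK INSERT dbo.Features
FROM 'C:\data\features.csv'
WITH (FORMATFILE = 'C:\data\features.fmt');
```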

and all I get is:

Msg 4866, Level 16, State 7, Line 35
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 35
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 35
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

Any pointers will help here as I cannot figure this one out.

Upvotes: 5

Views: 28957

Answers (5)

Daniel

Reputation: 1

If you are generating your format file using bcp, be aware of the -n and -c flags.

My problem was that I had the -n (native) flag specified, while my data file was only compatible with the -c (character) flag.

Format file generation:

bcp <DATABASE>.<SCHEMA>.<TABLE> format nul -t ; -T -c -S <IP> -f filename.fmt

Data file generation:

bcp "select * from table" queryout "filename.dat" -c -C 65001 -t ; -S <IP> -d <DATABASE> -T

Bulk import:

BULK INSERT dbo.table
FROM 'filename.dat'
WITH (FORMATFILE = 'filename.fmt');

Upvotes: 0

user1465073

Reputation: 325

I encountered this problem today, but only for specific rows with columns whose text value exceeds 8,000 characters. This happened regardless of my FMT file specifying SQLCHAR 0 0 (which indicates max); somewhere along the pipeline there is a limit of 8,000 characters.

I'm using Azure SQL and trying to read CSVs from an Azure Blob container.

Upvotes: 1

Milan Saha

Reputation: 935

Try this:

ROWTERMINATOR = '0x0a'
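In context, that would look something like this (table and file names assumed). The point is that a file with Unix-style line endings terminates rows with a bare LF, so a '\r\n' row terminator never matches:

```sql
-- Table and file names assumed; '0x0a' matches a bare LF row terminator
BULK INSERT dbo.Features
FROM 'C:\data\features.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a'
);
```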

Upvotes: 9

Phillip Deneka

Reputation: 241

For what it's worth, I had the same problem because of a conflict between the expected format and the actual format of the date field in my CSV. I changed the date format in my CSV, and then it worked.

Upvotes: 0

Klaus Nji

Reputation: 18867

The problem was caused by the default prefix-length settings in my format file. The data file I am importing was not created using bcp, so I had to set the prefix length of all the fields to 0, as follows:

10.0
12
1       SQLCHAR             0       100     ","    2     StopID               Latin1_General_CI_AS
2       SQLINT              0       4       ","    3     Code                 ""
3       SQLCHAR             0       100     ","    4     Name                 Latin1_General_CI_AS
4       SQLCHAR             0       200     ","    5     Summary              Latin1_General_CI_AS
5       SQLFLT4             0       4       ","    6     Lat                  ""
6       SQLFLT4             0       4       ","    7     Lon                  ""
7       SQLCHAR             0       100     ","    8     street               Latin1_General_CI_AS
8       SQLCHAR             0       50      ","    9     city                 Latin1_General_CI_AS
9       SQLCHAR             0       50      ","    10    region               Latin1_General_CI_AS
10      SQLCHAR             0       10      ","    11    postcode             Latin1_General_CI_AS
11      SQLCHAR             0       20      ","    12    country              Latin1_General_CI_AS
12      SQLCHAR             0       20      "\r\n"    13    zone_id              Latin1_General_CI_AS

With this change, the import was successful.

Upvotes: 10
