Alex Gordon

Reputation: 60691

Determine ROW that caused "unexpected end of file" error in BULK INSERT?

I am doing a bulk insert:

DECLARE @row_terminator CHAR;
SET @row_terminator = CHAR(10); -- LF

DECLARE @stmt NVARCHAR(2000);
SET @stmt = '
  BULK INSERT accn_errors
  FROM ''F:\FullUnzipped\accn_errors_201205080105.txt''
  WITH
  (
    FIRSTROW = 2,
    FIELDTERMINATOR = ''|'',
    ROWS_PER_BATCH = 10000,
    ROWTERMINATOR = '''+@row_terminator+'''
  )'
EXEC sp_executesql @stmt;

and am getting the following error:

Msg 4832, Level 16, State 1, Line 2
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

Is there a way to know on which ROW this error occurred?

I am able to import 10,000,000 rows without a problem; the error occurs after that.

Upvotes: 35

Views: 128103

Answers (11)

Dhaval Soni

Reputation: 416

I have a CSV file that I import using BULK INSERT.

Create a staging table in which every column is nullable, include only the columns that actually appear in the source file, and remove any trailing whitespace from the last row. Do not bulk insert into a table with an identity primary key column; BULK INSERT will not increment the identity for you, and that is what causes the error.

I have done a bulk insert like this:

CREATE TABLE [dbo].[Department](
    [Deptid] [bigint] IDENTITY(1,1) NOT NULL,
    [deptname] [nvarchar](max) NULL,
    [test] [nvarchar](max) NULL,
 CONSTRAINT [PK_Department] PRIMARY KEY CLUSTERED 
(
    [Deptid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, 
    ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

CREATE TABLE [dbo].[Table_Column](
    [column1] [nvarchar](max) NULL,
    [column2] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

BULK INSERT Table_Column
FROM 'C:\Temp Data\bulkinsert1.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 300000
);

INSERT INTO [dbo].[Department] (deptname, test)
SELECT column1, column2 FROM Table_Column;

Upvotes: 0

Karl Hoaglund

Reputation: 643

I got around the problem by converting all fields to strings and then using a common FIELDTERMINATOR. This worked:

BULK INSERT [dbo].[workingBulkInsert]  
FROM 'C:\Data\myfile.txt' WITH (
   ROWTERMINATOR = '\n', 
   FIELDTERMINATOR = ',' 
)

My data file looks like this now:

"01502","1470"
"01504","686"
"02167","882"
"106354","882"
"106355","784"
"106872","784"

The second field had been a decimal value with no double-quote delimiter (e.g., ,1470.00). Formatting both fields as quoted strings eliminated the error.

Upvotes: 0

Mhelboy

Reputation: 49

I have a CSV file that I import using BULK INSERT:

BULK INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n'
)
GO

Usually this script works with no problems, but on rare occasions I encounter this error:

"The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error."

Usually, this happens when the last rows have blank (null) values.

You need to link your CSV file into an MS Access database to check the data. (If your CSV has no more than about a million rows, you can open it in Excel; its limit is 1,048,576 rows.)

Since my data is around 3 million rows, I needed to use an Access database.

Then find the row number where the trailing blanks start, and subtract the number of blank rows from the total row count of the CSV.

If you have 2 blank rows at the end and the total number of rows is 30,000,005, the script becomes:

BULK INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n',
LASTROW = 30000003
)
GO

Cheers... Mhelboy

Upvotes: 6

stefano

Reputation: 43

The rows generating this error either lack the CHAR(10) terminator or contain unnecessary trailing spaces.

Upvotes: -4

feetwet

Reputation: 3446

In my experience this is almost always caused by something in the last two lines. tail the import file and it should still give you the failure. Then open that tail in a full text editor that lets you see non-printing characters like CR, LF, and EOF. That should enable you to kludge the file into working, even if you don't know why. See, e.g., "BULK INSERT fails with row terminator on last row".
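
If you would rather inspect those trailing bytes without leaving T-SQL, here is a minimal sketch. It assumes the question's file path, and it reads the entire file into memory, so for a multi-gigabyte file run it against a small tail copy instead:

DECLARE @blob VARBINARY(MAX);

-- SINGLE_BLOB returns the whole file as one varbinary column named BulkColumn
SELECT @blob = BulkColumn
FROM OPENROWSET(BULK N'F:\FullUnzipped\accn_errors_201205080105.txt', SINGLE_BLOB) AS f;

-- Show the last 32 bytes (assumes the file is at least 32 bytes long);
-- look for 0x0D (CR), 0x0A (LF), or a stray 0x1A (legacy EOF marker)
SELECT SUBSTRING(@blob, DATALENGTH(@blob) - 31, 32) AS last_32_bytes;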

Upvotes: 0

user2125311

Reputation: 67

I ran into the same issue. I had written a shell script to create a .csv on Linux, took that .csv to Windows, and tried to bulk load the data. It did not "like" the commas. Don't ask me why, but I changed the delimiter to * in the bulk import and did a find-and-replace of comma with * in my .csv, and that worked. A ~ delimiter also worked, and so did tab; it just didn't like the comma. Hope this helps someone.
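
A minimal sketch of that workaround; the table name and file path here are hypothetical, and it assumes every comma in the file has already been replaced with *:

-- Load the edited file using * as the field delimiter
BULK INSERT myData                        -- hypothetical table
FROM 'C:\import\myData_star.csv'          -- hypothetical path to the edited file
WITH (
    FIELDTERMINATOR = '*',
    ROWTERMINATOR = '\n'
);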

Upvotes: 1

user1588622

Reputation:

To locate the troublesome row, use the ERRORFILE specifier.

BULK INSERT myData
FROM 'C:\...\...\myData.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
ERRORFILE = 'C:\...\...\myRubbishData.log' 
);

myRubbishData.log will contain the offending rows, and a companion file myRubbishData.log.Error.Txt will give you row numbers and offsets into the file.

Companion file example:

Row 3 File Offset 152 ErrorFile Offset 0 - HRESULT 0x80004005
Row 5 File Offset 268 ErrorFile Offset 60 - HRESULT 0x80004005
Row 7 File Offset 384 ErrorFile Offset 120 - HRESULT 0x80004005
Row 10 File Offset 600 ErrorFile Offset 180 - HRESULT 0x80004005
Row 12 File Offset 827 ErrorFile Offset 301 - HRESULT 0x80004005
Row 13 File Offset 942 ErrorFile Offset 416 - HRESULT 0x80004005
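
One caveat worth adding: BULK INSERT gives up after 10 errors by default, so if you expect many bad rows, raise MAXERRORS so the error file captures all of them. Same statement, one extra option:

BULK INSERT myData
FROM 'C:\...\...\myData.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    ERRORFILE = 'C:\...\...\myRubbishData.log',
    MAXERRORS = 1000   -- default is 10
);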

Upvotes: 58

Peter

Reputation: 1065

Yeah - BULK INSERT would have done well with a bit more detail in its error messages, and the only way around this is to use the brute-force approach, as Gordon rightly pointed out. First, though, based on the error you're getting, either it is not understanding your row terminator, or there is a row terminator missing at the end of the file. Using FIRSTROW and LASTROW will help to determine that.

So, you need to do the following:

  1. Check that there is a row terminator at the end of the file. If not, put one in and try again. Also make sure that the last row contains all of the necessary fields. If it says 'EOF', then that is your problem.
  2. Are you sure there's an LF at the end of each line? Try a CR (\r, 0x0D) and see if that works.
  3. Still not working? Try setting LASTROW=2 and try again. Then try LASTROW=3. If you have more than three rows in your file and this step fails, then the row terminator isn't working; see the sketch after this list.
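
A minimal sketch of that bisection, reusing the question's table and file path; LASTROW is the knob you move:

BULK INSERT accn_errors
FROM 'F:\FullUnzipped\accn_errors_201205080105.txt'
WITH
(
    FIRSTROW = 2,
    LASTROW = 2,            -- raise this step by step; when the statement
                            -- starts failing, the bad row is bracketed
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '0x0A'  -- hex notation for CHAR(10)
);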

Upvotes: 3

Steve Kass

Reputation: 7184

If CHAR(10) is the row terminator, I don't think you can put it in quotes like you are trying to in BULK INSERT. There is an undocumented way to indicate it, though:

ROWTERMINATOR = '0x0A'
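
With that notation, the dynamic SQL in the question becomes unnecessary; here is a sketch of the same load as a plain statement, using the question's table, path, and options:

BULK INSERT accn_errors
FROM 'F:\FullUnzipped\accn_errors_201205080105.txt'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '|',
    ROWS_PER_BATCH = 10000,
    ROWTERMINATOR = '0x0A'  -- LF as a hex byte; no CHAR(10) string-splicing needed
);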

Upvotes: 4

Gordon Linoff

Reputation: 1269445

Fun, fun, fun. I haven't found a good way to debug these problems, so I use brute force. That is, the FirstRow and LastRow options are very useful.

Start with LastRow = 2 and keep trying. Load the results into a throwaway table that you can readily truncate.

And, you should also keep in mind that the first row could be causing you problems as well.
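
A sketch of that throwaway-table pattern; the scratch table name is hypothetical, and its shape is copied from the question's target table:

-- Empty copy of the target table's shape (hypothetical name)
SELECT TOP 0 * INTO accn_errors_scratch FROM accn_errors;

-- Probe a slice of the file; widen LASTROW between attempts
BULK INSERT accn_errors_scratch
FROM 'F:\FullUnzipped\accn_errors_201205080105.txt'
WITH (FIRSTROW = 2, LASTROW = 2, FIELDTERMINATOR = '|', ROWTERMINATOR = '0x0A');

-- Reset between attempts
TRUNCATE TABLE accn_errors_scratch;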

Upvotes: 6
