Reputation: 945
I'm getting a conversion error when I try to import a text file into my database. Below is the error message I received:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 4 (Year).
Here is my query code:
CREATE TABLE Students
(
StudentNo Integer NOT NULL Primary Key,
FirstName VARCHAR(40) NOT NULL,
LastName VARCHAR(40) NOT NULL,
Year Integer,
GPA Float NULL
);
Here is the sample data from text file:
100,Christoph,Van Gerwen,2011
101,Anar,Cooke,2011
102,Douglis,Rudinow,2008
I think I know what the problem is. Below is my bulk insert code:
use xta9354
bulk insert xta9354.dbo.Students
from 'd:\userdata\xta9_Students.txt'
with (fieldterminator = ',',rowterminator = '\n')
With the sample data, there is no ',' after the Year attribute even though there is still another attribute, GPA, after Year which is NULL.
Can someone please tell me how to fix this?
Upvotes: 31
Views: 188641
Reputation: 11
The OP's example data does not have a header row, but if it did, you would see this message because the header text "Year" cannot be inserted into an integer column. This is the scenario that caused the same message for me (although, for me, it was "eligible_date" trying to go into a datetime column). I added the FIRSTROW = 2 parameter to skip the header row.
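A minimal sketch of that fix against the OP's table, assuming the file really does start with a header row (path and names are the OP's):
BULK INSERT xta9354.dbo.Students
FROM 'd:\userdata\xta9_Students.txt'
WITH (
    FIRSTROW = 2,            -- skip the header row so the text "Year" is never parsed as an integer
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);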
Upvotes: 0
Reputation: 21
My CSV's blank columns had a space character in the blank cells; after removing those, it worked for me. Here is the script I used for the bulk insertion:
BULK INSERT <table_name>
FROM '<file path>\\test.csv'
WITH
(
FORMAT='CSV',
FIRSTROW=2,
FIELDTERMINATOR=',',
ROWTERMINATOR = '0x0a'
)
GO
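If you cannot remove the stray spaces from the file itself, a sketch of the same idea done in T-SQL instead: bulk load into an all-VARCHAR staging table first, then trim and convert while inserting (the staging table and column names below are illustrative, borrowed from the OP's schema):
-- Hypothetical staging table: every column as text so nothing fails on load
CREATE TABLE dbo.Students_Staging
(
    StudentNo VARCHAR(40),
    FirstName VARCHAR(40),
    LastName  VARCHAR(40),
    [Year]    VARCHAR(40)
);

BULK INSERT dbo.Students_Staging
FROM '<file path>\test.csv'
WITH (FORMAT = 'CSV', FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');

-- Trim stray spaces and turn blank cells into NULL before converting types
INSERT INTO dbo.Students (StudentNo, FirstName, LastName, [Year])
SELECT CAST(StudentNo AS INT),
       FirstName,
       LastName,
       CAST(NULLIF(LTRIM(RTRIM([Year])), '') AS INT)
FROM dbo.Students_Staging;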
Upvotes: 0
Reputation: 51
I was given data from Microsoft SQL Server and needed to import it into Azure SQL Edge.
I tried bcp, the import data plugin for Azure Data Studio, and BULK INSERT, but I always got some kind of error saying there was something wrong either with the file or with the data in it.
What worked for me: replacing 'null' with NULL.
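A sketch of what that replacement could look like in T-SQL once the data is sitting in an all-VARCHAR staging table (the table and column names here are hypothetical):
UPDATE dbo.ImportStaging
SET GPA = NULL
WHERE GPA = 'null';   -- the export wrote the literal string 'null' instead of leaving the field empty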
Upvotes: 0
Reputation: 1182
My guess is it's an encoding problem: for instance, your file is UTF-8 but SQL Server will not read it the way it should, so it attempts to insert 100ÿ or something along those lines into your table.
Possible fixes: specify the CODEPAGE in the BULK INSERT statement, or re-encode the file (for example with PowerShell).
Code samples:
1.
BULK INSERT myTable FROM 'c:\Temp\myfile.csv' WITH (
FIELDTERMINATOR = '£',
ROWTERMINATOR = '\n',
CODEPAGE = 'ACP' -- ACP corresponds to ANSI; also try '65001' for UTF-8
);
2.
get-content "myfile.csv" | Set-content -Path "myfile.csv" -Encoding String
# String = ANSI, also try Ascii, Oem, Unicode, UTF7, UTF8, UTF32
Upvotes: 2
Reputation: 34055
Try using a format file since your data file only has 4 columns. Otherwise, try OPENROWSET or use a staging table.
myTestFormatFiles.Fmt may look like:
9.0
4
1   SQLINT    0   3     ","      1   StudentNo   ""
2   SQLCHAR   0   100   ","      2   FirstName   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   100   ","      3   LastName    SQL_Latin1_General_CP1_CI_AS
4   SQLINT    0   4     "\r\n"   4   Year        ""
(source: microsoft.com)
This tutorial on skipping a column with BULK INSERT may also help.
Your statement then would look like:
USE xta9354
GO
BULK INSERT xta9354.dbo.Students
FROM 'd:\userdata\xta9_Students.txt'
WITH (FORMATFILE = 'C:\myTestFormatFiles.Fmt')
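If you go the OPENROWSET route instead, a minimal sketch reusing the same format file (paths as above; the query itself is illustrative, not from the original answer):
INSERT INTO xta9354.dbo.Students (StudentNo, FirstName, LastName, [Year])
SELECT StudentNo, FirstName, LastName, [Year]
FROM OPENROWSET(
    BULK 'd:\userdata\xta9_Students.txt',
    FORMATFILE = 'C:\myTestFormatFiles.Fmt'
) AS src;
-- GPA is not listed, so it stays NULL to match the 4-column data file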
Upvotes: 28
Reputation: 11
I gave the MSSQLSERVER account full access to the folder and added the diskadmin and bulkadmin server roles.
In my C# application, when preparing the bulk insert command, I originally had:
string strsql = "BULK INSERT PWCR_Contractor_vw_TEST FROM '" + strFileName + "' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')";
And I got this error: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 8 (STATUS).
I looked at my log file and found that the terminator had become an actual line break instead of the characters '\n'. The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error:
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". Query :BULK INSERT PWCR_Contractor_vw_TEST FROM 'G:\NEWSTAGEWWW\CalAtlasToPWCR\Results\parsedRegistration.csv' WITH (FIELDTERMINATOR = ',', **ROWTERMINATOR = ''**)
So I added an extra escape to the row terminator:
string strsql = "BULK INSERT PWCR_Contractor_vw_TEST FROM '" + strFileName + "' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')";
And now it inserts successfully.
Bulk Insert SQL - ---> BULK INSERT PWCR_Contractor_vw_TEST FROM 'G:\\NEWSTAGEWWW\\CalAtlasToPWCR\\Results\\parsedRegistration.csv' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
Bulk Insert to PWCR_Contractor_vw_TEST successful... ---> clsDatase.PerformBulkInsert
Upvotes: 1
Reputation: 352
We use bulk insert as well. The file we upload is sent from an external party. After a while of troubleshooting, I realized that their file had columns with commas in them. Just another thing to look for...
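If those embedded commas sit inside quoted fields, one way to cope (SQL Server 2017 and later; the table name and path below are illustrative) is the CSV format option with FIELDQUOTE:
BULK INSERT dbo.UploadedFile
FROM 'c:\Temp\external_party_file.csv'
WITH (
    FORMAT = 'CSV',         -- RFC 4180 style parsing
    FIELDQUOTE = '"',       -- commas inside quoted fields are treated as data, not delimiters
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a'
);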
Upvotes: 4
Reputation: 547
The above options work for Google BigQuery files as well. I exported table data to Google Cloud Storage and downloaded it from there. While loading it into SQL Server I was facing this issue, and I could successfully load the file after specifying the row delimiter as
ROWTERMINATOR = '0x0a'
Pay attention to the header record as well and specify
FIRSTROW = 2
My final block for the data file exported from Google BigQuery looks like this:
BULK INSERT TABLENAME
FROM 'C:\ETL\Data\BigQuery\In\FILENAME.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '0x0a',--Files are generated with this row terminator in Google Bigquery
TABLOCK
)
Upvotes: 3
Reputation: 2135
In my case, I was dealing with a file that was generated by Hadoop on a Linux box. When I tried to import it into SQL Server I had this issue. The fix wound up being to use the hex value for 'line feed', 0x0a. It also worked with bulk insert:
bulk insert table from 'file'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a')
Upvotes: 21