Reputation: 686
Hi,
I am using BULK INSERT to insert records from a CSV file into a database. In some files I have a field with a type mismatch or an invalid character,
but SQL Server inserts all the other rows except that one.
Example: a file with 7334 rows (1 row has a type mismatch) -> 7333 rows inserted.
Is this the normal behavior of SQL Server, or is something wrong?
Upvotes: 0
Views: 457
Reputation: 407
Yes, this is the expected behaviour when you try to insert data whose type is different (in most cases) from the field(s) you are inserting into. For instance, you will get a type mismatch error if you try to insert a text value into an integer field.
A bigint column won't let you insert a value like '00:12:2A:3F'. First check your source data and verify whether that column actually contains the value '00:12:2A:3F'. If it doesn't, the cause could be the one I explain in the note below. If the value does match the source data, you might have to change the column definition to an appropriate data type such as varchar().
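A minimal sketch of that idea, using a hypothetical demo table (not your actual schema):

-- Hypothetical demo table with a bigint column
CREATE TABLE dbo.TypeMismatchDemo (MacAddress bigint);

-- Fails with a conversion error: '00:12:2A:3F' is not a valid bigint
INSERT INTO dbo.TypeMismatchDemo (MacAddress) VALUES ('00:12:2A:3F');

-- After changing the column to a character type, the same value is accepted
ALTER TABLE dbo.TypeMismatchDemo ALTER COLUMN MacAddress varchar(20);
INSERT INTO dbo.TypeMismatchDemo (MacAddress) VALUES ('00:12:2A:3F');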
Unless you have forced CHECK_CONSTRAINTS, SQL Server will skip the erroneous records and insert the other complying records. If you want the insert to fail on an error (such as a type mismatch), use CHECK_CONSTRAINTS in your BULK INSERT statement like this:
BULK INSERT DB.dbo.Test
FROM '.....'
WITH ( CHECK_CONSTRAINTS, .......)
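A slightly fuller sketch, assuming a hypothetical file path and a comma-delimited file with a header row (the path and table name are placeholders, not yours):

BULK INSERT DB.dbo.Test
FROM 'C:\import\test.csv'        -- hypothetical path
WITH (
    CHECK_CONSTRAINTS,           -- enforce table constraints during the bulk load
    FIELDTERMINATOR = ',',       -- comma-delimited fields
    ROWTERMINATOR = '\n',        -- one record per line
    FIRSTROW = 2                 -- skip the header row, if the file has one
);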
Note: One thing to keep in mind is to make sure the data itself isn't polluted with field delimiters. For instance, if your fields are separated by the ',' delimiter and one of your fields has a comma inside the data (a text column like 'comments' may contain commas), SQL Server thinks a new column starts there and breaks the field, causing the other columns to be misaligned. That could be the reason for your type mismatch.
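For example, with a hypothetical 3-column file (id, comment, amount), a line like

42,needs review, urgent,100

splits into four fields instead of three, so the text ' urgent' lands in the integer amount column and triggers a type mismatch.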
Upvotes: 1