Reputation: 2700
Data are given to me in a spreadsheet, which I'm converting to pipe-delimited files. I tried inserting over 46,000 rows, but four (4) of them failed with the error message
SQLState = 22001, NativeError = 0 Error = [Microsoft][SQL Native Client]String data, right truncation
It would be nice to know which rows fail. Is there a way I can either force these rows to be inserted with the truncated strings, or to determine which ones fail with BCP?
bcp TableName in MyCreatedCSVFile.csv -T -c -t^|
(I had to escape the | character on the command line with ^.) Is there a better way to do this?
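For reference, wrapping the terminator in quotes also seems to avoid the caret escape on cmd.exe, though I haven't tried it on other shells:
bcp TableName in MyCreatedCSVFile.csv -T -c -t"|"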
Upvotes: 2
Views: 2806
Reputation: 1
I managed to do this by creating an INSTEAD OF trigger.
I created a table with varchar(max) columns to ensure I didn't get the truncation errors. I then created an INSTEAD OF trigger on the insert (so this 'proxy' table never gets rows).
In the trigger I do a CONVERT(varchar(50), name255), i.e., performing the truncation explicitly.
However, this means that you need to specify the FIRE_TRIGGERS hint (-h "FIRE_TRIGGERS") with your BCP command, as in the sketch below.
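A minimal sketch of the setup; the proxy table, the name255 source column, and the FinalTable target with its narrower name50 column are all placeholder names:
-- Proxy table: varchar(max) so BCP itself never truncates
CREATE TABLE ProxyTable (name255 varchar(max));
GO
-- Route every insert to the real table, truncating explicitly
CREATE TRIGGER trg_ProxyTable_Insert ON ProxyTable
INSTEAD OF INSERT
AS
BEGIN
    INSERT INTO FinalTable (name50)
    SELECT CONVERT(varchar(50), name255) FROM inserted;
END
GO
bcp ProxyTable in MyCreatedCSVFile.csv -T -c -t^| -h "FIRE_TRIGGERS"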
Hope this helps.
Upvotes: 0
Reputation: 432521
Are you using a staging table?
You'd normally use one with generous column lengths, etc., then perform validation and quality checks. Even a simple
INSERT Finaltable (..)
SELECT (..), LEFT (offendingcolumn, 100) FROM Stagingtable
A stored proc can also be used to do a BULK INSERT, then some checks, then an INSERT into the final table, as in the sketch below.
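A rough sketch of that flow; the file path, column name, and the 100-character limit are placeholders:
-- Staging table with generous lengths so the bulk load never fails
CREATE TABLE Stagingtable (offendingcolumn varchar(max));

BULK INSERT Stagingtable
FROM 'C:\data\MyCreatedCSVFile.csv'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- Validation and quality checks go here, then truncate explicitly on the way in
INSERT Finaltable (offendingcolumn)
SELECT LEFT(offendingcolumn, 100) FROM Stagingtable;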
Upvotes: 1
Reputation: 17010
You can set the maximum number of errors high enough to cover the exceptions and specify an error file; the failed rows land in that file, and you then handle them one at a time. This is the cleanest method. If you are willing to use SSIS, you can build a more elaborate workflow that handles the truncation and perhaps logging, but you lose the benefit of bulk copying.
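For example, using the questioner's command with bcp's -m (max errors) and -e (error file) switches; the limit of 10 and the file name errors.txt are arbitrary:
bcp TableName in MyCreatedCSVFile.csv -T -c -t^| -m 10 -e errors.txt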
Upvotes: 0