import flat file to SQL Server - Is there some kind of size / cell limit?

Can someone please confirm / deny if there is a cell limit (or any other kind of size restriction) when importing flat files into SQL Server?

I have been having an exasperating time recently, importing csv files which continually return the error "Could not convert to nvarchar, data would be truncated"... obviously that error message doesn't help, so I put error reporting on, and noticed that it routinely errored between rows 300 and 330... regardless of file.

I must've put 20 files through and got the same error, always at roughly the same row range. I have manually checked every file - there is no obvious cause for that error message in that cell range in any of the files.

My files aren't large (<1mb), contain mainly nvarchar data, and are limited to 500 rows... they are however quite wide (250 cols), leading me to wonder if there's a cell limit?

I tried chopping the files in two, so they were ~250 rows each, and - like magic - they import without issue.

Before I change the process that produces these files, can someone confirm if my suspicion is correct? (Or suggest an alternative cause?)

Edit - Using the built-in "Import Flat File" wizard in SQL Server 15.0. Comma separated; double-quoted; not sure about single quotes within the strings because the text is in Spanish and they don't use apostrophes!

Edit 2 - probably best to close this one now. A lot of logical suggestions which mirror the troubleshooting I've been doing. That is at least reassuring that it's not an obvious oversight.

Upvotes: 2

Views: 4546

Answers (4)

JustBeingHelpful

Reputation: 18980

This answer is from the perspective of what file size worked with SQL Server Management Studio (SSMS) v20.1 on my computer, using the workflow >>> SSMS >>> Object Explorer >>> right-click database >>> Tools >>> Import Flat File <<<. I didn't find the exact cut-off, but if you break up the file, it'll work. The answer probably depends on hardware specifications, software specifications, and the current environment (i.e. current memory usage, network constraints, sockets available, etc.), so there is probably not one correct answer for everyone.

This worked:

  • 304,104 KB (file size) / 2,592,754 (lines)

This did NOT work:

  • 1,210,450 KB (file size) / 7,197,114 (lines)
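If you'd rather not keep splitting the physical file, a rough workaround is to load it in windows with BULK INSERT and its FIRSTROW / LASTROW options instead of the wizard. This is just a sketch - the path, table name and row boundaries are placeholders, and FORMAT = 'CSV' needs SQL Server 2017 or later:

    -- Load one window of a large CSV into a pre-created staging table.
    -- Run again with the next FIRSTROW/LASTROW window for the remaining rows.
    BULK INSERT dbo.MyImport_Staging
    FROM 'C:\data\bigfile.csv'
    WITH (
        FORMAT = 'CSV',          -- handles quoted fields (SQL Server 2017+)
        FIRSTROW = 2,            -- skip the header row
        LASTROW = 1000000,       -- stop here; the next batch starts at 1000001
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '0x0a',
        CODEPAGE = '65001',      -- UTF-8 source file
        TABLOCK
    );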

System Information:

(screenshot of the machine's system specifications)

Upvotes: 0

Joe Shark

Reputation: 688

I would suggest loading the file into a staging table first so there's no issue with the data type/length of the existing fields in the destination table.

When importing, check the 'advanced' details for each column after choosing the data source. The wizard checks a sample of rows at the start of the file (can't remember how many) to estimate each column's length. It defaults to 50, so if the first x rows of your file all have under 50 characters it will use that, not realising that subsequent rows have more characters.

This may explain why splitting the file in two solves the problem... the first part may have rows that are all in the lower range of characters (e.g. under 50), while for the second part the sampling does pick up the longer rows and sets the estimated column width accordingly. If both estimated widths fit within the destination column, there will be no issue with the actual INSERT.

You can set the OutputColumnWidth to a more suitable number.
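As a rough sketch of that staging approach (the table and column names here are made up - adjust them to your file):

    -- Staging table with deliberately generous widths so nothing is truncated on load.
    CREATE TABLE dbo.FlatFile_Staging (
        Col001 NVARCHAR(MAX) NULL,
        Col002 NVARCHAR(MAX) NULL
        -- ... one NVARCHAR(MAX) column per field in the flat file (250 here)
    );

    -- After the wizard loads into staging, move the data into the real table,
    -- casting to the destination widths explicitly.
    INSERT INTO dbo.FlatFile_Final (Col001, Col002)
    SELECT CAST(Col001 AS NVARCHAR(100)),
           CAST(Col002 AS NVARCHAR(255))
    FROM dbo.FlatFile_Staging;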

Upvotes: 2

ASH

Reputation: 20302

Microsoft Excel has a limit of 32,767 characters per cell; you can read more about this limit here:

https://support.microsoft.com/en-us/office/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3

You can find the character length of a value with the LEN() function. As for SQL Server, you can set your data type to VARCHAR and get up to 8,000 characters. You can see some specs here:

https://www.sqlservertutorial.net/sql-server-basics/sql-server-nvarchar/
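For example, once a sample of the file is loaded into a working table, something like this (the table and column names are placeholders) shows the longest value in each suspect column so you can compare it against the destination width:

    -- Longest value per column in the loaded sample; compare these against the
    -- destination column definitions to spot where truncation would occur.
    SELECT
        MAX(LEN(Col001)) AS MaxLen_Col001,
        MAX(LEN(Col002)) AS MaxLen_Col002
    FROM dbo.ImportSample;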

I'd suggest that you try importing a small sample of data first; save a copy of your file with just a few dozen rows of data. Make sure it works, and then try a few more times with larger and larger data sets, going from Excel to SQL Server. Whatever the error is, it should quickly become apparent to you, I think, as you go from small sets of data to larger ones.

Upvotes: 0

David Browne - Microsoft

Reputation: 89071

Can someone please confirm / deny if there is a cell limit (or any other kind of size restriction) when importing flat files into SQL Server?

You are not hitting some sort of limit in SQL Server. It's something with your files or the tool you're using to load them.

Upvotes: -1
