Niall

Reputation: 1621

Querying text file with SQL converts large numbers to NULL

I am importing data from a text file and have hit a snag. I have a numeric field which occasionally contains very large values (10 billion+), and some of these values are being converted to NULLs.

Upon further testing I have isolated the problem as follows: the first 25 rows of data are used to determine the field type, and if none of the first 25 values is large, then any value >= 2,147,483,648 (2^31) that comes after is thrown out.

I'm using ADO and the following connection string:

    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=FILE_ADDRESS;Extended Properties="text;HDR=YES;FMT=Delimited"
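For reference, I open the recordset along these lines; this is just a sketch, and the folder path, file name and SQL are placeholders (with the Jet text driver, Data Source points at the folder and the file name acts as the table name):

    ' Requires a reference to Microsoft ActiveX Data Objects
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\Data\;" & _
            "Extended Properties=""text;HDR=YES;FMT=Delimited"""

    ' The file name serves as the table name; brackets are needed because of the dot
    Set rs = New ADODB.Recordset
    rs.Open "SELECT * FROM [MyData.csv]", cn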

Therefore, can anyone suggest how I can get around this problem without having to sort the source data descending on the large-value column? Is there some way I could define the data types of the recordset prior to importing, rather than letting the driver decide for itself?

Many thanks!

Upvotes: 0

Views: 161

Answers (1)

Charleh

Reputation: 14002

You can place a Schema.ini file in the directory you are connecting to which describes the column types, so the driver no longer has to guess them from the first rows of data.

See here for details:

http://msdn.microsoft.com/en-us/library/windows/desktop/ms709353(v=vs.85).aspx
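As a sketch, a Schema.ini along these lines (the file name and column names are placeholders for your own) would force the large column to be read as Double, which holds values well beyond 2^31:

    [MyData.csv]
    Format=CSVDelimited
    ColNameHeader=True
    Col1=ID Long
    Col2=BigValue Double

Currency is another option if the values are whole numbers, and setting MaxScanRows=0 in the same section makes the driver scan the entire file rather than just the first rows when it does have to infer a type.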

Upvotes: 1
