Amit

Reputation: 23

SQL Server Dynamic SQL Query

I have a text file with about 400,000 records which have to be read, processed and inserted into a table. I'm using a stored procedure to do this. The records are pipe-separated, as shown below:

a | b | c .... 180 columns (1st record)
d | e | f .... 180 columns (2nd record)
.
.
.
x | y | z .....180 columns (400,000th record)

In the stored procedure, one insert statement was being fired for each record. I created a dynamic SQL query that would club 1,000 records into one insert, but noticed that the execution time did not decrease. In fact, building the dynamic SQL query for a single record (it includes ISNULL and CAST calls for each column) takes more time than inserting that record into the table.

Is there a way I can reduce the time taken to perform this task?

EDIT: The dynamic SQL query looks something like this (just a small excerpt):

CAST(GETDATE() AS VARCHAR(20)) + ''',' + 
CAST(@N_ER AS VARCHAR(20)) + ',' + 
CAST(@N_INSDE AS VARCHAR(20)) + ',' + 
CAST(@N_CODE AS VARCHAR(20)) + ',' + 
CAST(@NJOB_NUMBER AS VARCHAR(30)) + ',' + 
CAST(@NNUMBER AS VARCHAR(30)) + ',''' + 
ISNULL(DESTINATION,'') + ''',''' +  
ISNULL(@VPE_ID,'') + ''',''' + 
ISNULL(dbo.fn_NEW_CARD(@VAN),'') +

Or is there a way to improve the concatenation using some other set of functions?
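(For reference, on SQL Server 2012+ the CONCAT function implicitly converts each argument to a string and treats NULL as an empty string, which removes most of the CAST/ISNULL noise. A sketch only, reusing a few of the variables from the snippet above:)

```sql
-- Sketch: CONCAT converts every argument to a string and turns NULL
-- into '', so the explicit CASTs and ISNULLs can be dropped.
SELECT CONCAT(
    '''', GETDATE(), ''',',
    @N_ER, ',',
    @N_INSDE, ',',
    @N_CODE, ',''',
    @VPE_ID, ''''
);
```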

Upvotes: 1

Views: 119

Answers (1)

john McTighe

Reputation: 1181

Instead of using EXEC to run your dynamic SQL, have you tried sp_executesql (with parameters)? The advantage is that SQL Server can cache the query plan, which is a fairly significant saving over 400K inserts.
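A minimal sketch of the parameterised call (the target table, column list and types here are assumptions, not from your schema):

```sql
-- Because the text of @sql never changes, its plan is cached and
-- reused across all 400K executions; only the parameter values vary.
DECLARE @sql NVARCHAR(MAX) =
    N'INSERT INTO dbo.TargetTable (Col1, Col2, Col3)
      VALUES (@p1, @p2, @p3);';

EXEC sp_executesql
    @sql,
    N'@p1 INT, @p2 VARCHAR(50), @p3 VARCHAR(50)',  -- parameter definitions
    @p1 = @N_ER, @p2 = @VPE_ID, @p3 = @VAN;        -- per-row values
```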

To be honest, SSIS is by far the best way to do it. Right-click on the database, select Tasks > Import Data, then follow the wizard. You can even save the created package for later use in a job.
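If SSIS isn't an option, BULK INSERT can load the whole pipe-delimited file in one set-based statement (file path and table name below are assumptions, and any per-row processing would then happen in the table afterwards):

```sql
-- Set-based load of the pipe-delimited file in a single statement.
-- Path and table name are placeholders for illustration.
BULK INSERT dbo.TargetTable
FROM 'C:\data\input.txt'
WITH (
    FIELDTERMINATOR = '|',   -- records are pipe-separated
    ROWTERMINATOR   = '\n',
    TABLOCK                  -- allows minimal logging when conditions permit
);
```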

Upvotes: 1

Related Questions