Roy Goldreich

Reputation: 7

Is there a limit for number of rows that can be passed to SQL Server stored procedure in a table-valued parameter?

I'm using SQL Server 2019.

For some reason, when trying to pass a list of 10,000 ids to SQL Server from a C# web server as a table-valued parameter, only some of the ids (~7,000) end up being sent to the server (verified with a trace).

Is this a known issue or are there configurations that I can set to ensure that all values are passed?

I have created the following type in SQL Server:

CREATE TYPE [dbo].[IntList] AS TABLE([ID] [int] NULL)

Then I have a stored procedure that accepts the type as a parameter:

CREATE PROCEDURE [dbo].[spStackOverflowQuestionExample]
    @Ids dbo.IntList READONLY,

C# server code: the parameter is created as follows:

var dt = new DataTable();
dt.Columns.Add("Id", typeof(int));

foreach (var id in ids)
{
    dt.Rows.Add(id);
}

parameters.Add(new SqlParameter { ParameterName = "@Ids", Value = dt, SqlDbType = SqlDbType.Structured });

The parameter is then added to a command, and executed with ExecuteReaderAsync.
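
For completeness, this is roughly how the command is built and executed (simplified; the connection handling is a placeholder, and I've set the optional TypeName explicitly even though it can be inferred for stored procedures):

using var conn = new SqlConnection(connectionString);
using var cmd = new SqlCommand("dbo.spStackOverflowQuestionExample", conn)
{
    CommandType = CommandType.StoredProcedure
};

cmd.Parameters.Add(new SqlParameter
{
    ParameterName = "@Ids",
    SqlDbType = SqlDbType.Structured,
    TypeName = "dbo.IntList", // optional for stored procedures, explicit here
    Value = dt
});

await conn.OpenAsync();
using var reader = await cmd.ExecuteReaderAsync();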

I have read online that people have managed to pass large table-valued parameters to stored procedures:

https://www.dbdelta.com/sql-server-tvp-performance-gotchas/

Are there other ways to do this?

I tried batching the request on the server into multiple calls, but I'm still looking for a solution that avoids multiple round trips from the web server to the database. One alternative I've seen (for example in the article above) is streaming the rows instead of materializing a DataTable; a sketch follows.
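
This is a minimal sketch of the streaming approach, assuming System.Data.SqlClient with the Microsoft.SqlServer.Server namespace; the rows are produced lazily via IEnumerable<SqlDataRecord> rather than buffered in a DataTable:

using System.Collections.Generic;
using System.Data;
using Microsoft.SqlServer.Server;

static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<int> ids)
{
    // Single int column, matching dbo.IntList
    var meta = new SqlMetaData("ID", SqlDbType.Int);
    foreach (var id in ids)
    {
        var record = new SqlDataRecord(meta);
        record.SetInt32(0, id);
        yield return record;
    }
}

// Passed the same way as the DataTable:
// new SqlParameter { ParameterName = "@Ids", SqlDbType = SqlDbType.Structured,
//                    TypeName = "dbo.IntList", Value = ToRecords(ids) }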

Upvotes: -1

Views: 103

Answers (1)

Shreyans Padmani

Reputation: 9

It looks like your table-valued parameter (TVP) isn't fully reaching SQL Server. A few things worth ruling out are network packet size, command timeout, and memory pressure. Try increasing the packet size in your connection string (Packet Size=32767) and setting a higher command timeout (command.CommandTimeout = 120). You could also consider SqlBulkCopy for large loads. Before executing, verify that the DataTable actually contains all of the ids (Console.WriteLine(dt.Rows.Count)), and check SQL Server's memory settings with EXEC sp_configure 'max server memory'. If the issue persists, try sending the data in smaller batches (sketched below).

var connectionString = "your_connection_string;Packet Size=32767";
command.CommandTimeout = 120; // timeout in seconds
// SqlBulkCopy alternative: load the ids into a staging table instead of a TVP
using var bulkCopy = new SqlBulkCopy(connectionString);
bulkCopy.DestinationTableName = "dbo.IdStaging"; // hypothetical staging table
bulkCopy.WriteToServer(dt);
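
If you do end up batching, a minimal sketch (assuming ids is a List<int> and System.Linq is available; the chunk size of 1,000 is arbitrary):

const int chunkSize = 1000; // arbitrary; tune for your workload
for (var offset = 0; offset < ids.Count; offset += chunkSize)
{
    var chunk = ids.Skip(offset).Take(chunkSize).ToList();
    // build a DataTable from 'chunk' and call the procedure as in the question
}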

As far as I know there is no documented limit on the number of rows in a TVP, so the row count itself shouldn't be the restriction.

Upvotes: -1
