Reputation: 2584
I am trying to fetch blocks of rows from my Sybase DB (i.e. fetch 100 rows at a time per transaction).
However, I have read here and here that T-SQL simply does not support this.
Is there any workaround for this?
Also, if it is possible to fetch multiple rows at a time, how could I modify the following code to do so:
DECLARE my_cursor CURSOR FOR
SELECT
    COL1,
    COL2,
    ...
    COLN
FROM
    MY_DB
WHERE
    SOME_CONDITION_SATISFIED
GO

DECLARE
    @VAR1 TYPE,
    @VAR2 TYPE,
    ...
    @VARN TYPE

SET NOCOUNT ON

OPEN my_cursor

WHILE @@SQLSTATUS = 0
BEGIN
    FETCH my_cursor INTO
        @VAR1,
        @VAR2,
        ...
        @VARN
END

CLOSE my_cursor
DEALLOCATE CURSOR my_cursor
Any help would be appreciated.
Upvotes: 1
Views: 7295
Reputation: 106
You can fetch multiple rows at a time in Sybase. Use the following to fetch 100 rows at a time:
set cursor rows 100 for cursor_name
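A minimal sketch of how this could combine with the cursor from the question (cursor name taken from there; everything else is Sybase ASE syntax as I understand it). Note that when cursor rows is greater than 1, FETCH cannot use an INTO clause, so the rows go straight to the client:

```sql
-- Assumes my_cursor has been declared as in the question.
SET CURSOR ROWS 100 FOR my_cursor  -- each FETCH now returns up to 100 rows

OPEN my_cursor

WHILE @@SQLSTATUS = 0
BEGIN
    -- No INTO clause: the batch of up to 100 rows is returned
    -- to the client as a result set on each FETCH.
    FETCH my_cursor
END

CLOSE my_cursor
DEALLOCATE CURSOR my_cursor
```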
Upvotes: 0
Reputation: 6703
From the database programmer's point of view, SQL Server treats cursors just like Sybase does, and there is nothing wrong with your code. In T-SQL you have essentially no control over the pace at which the server fetches physical rows to populate the cursor; from the programmer's point of view, though, that shouldn't matter.
If you need this fine-grained control over network usage, you are better off with APIs suited to ETL processes, such as SSIS.
If your problem is preventing database contention caused by an open transaction, though, you can use transaction isolation level read uncommitted
(so the table isn't locked) or insert some transaction-management logic with a counter, as in:
....
DECLARE @counter INT
SET @counter = 0

BEGIN TRANSACTION -- open the first transaction

OPEN my_cursor

WHILE @@SQLSTATUS = 0
BEGIN
    -- transaction control: commit every 100 rows and open another transaction
    IF @counter > 99
    BEGIN
        SET @counter = 0
        COMMIT TRANSACTION
        BEGIN TRANSACTION
    END
    ELSE
        SET @counter = @counter + 1

    FETCH my_cursor INTO
        @VAR1,
        @VAR2,
        ...
        @VARN
END

COMMIT TRANSACTION -- close the last transaction
....
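For the read-uncommitted alternative mentioned above, a minimal sketch (Sybase ASE syntax; in ASE, isolation level 0 is read uncommitted, and permits dirty reads in exchange for not taking shared locks):

```sql
-- Avoid contention by reading without shared locks (dirty reads possible).
SET TRANSACTION ISOLATION LEVEL 0

OPEN my_cursor
-- ... fetch loop as in the question ...
CLOSE my_cursor
```

Whether dirty reads are acceptable depends on your use case; for reporting or ETL extracts they often are, for anything transactional they usually are not.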
Upvotes: 2