Reputation: 9856
I have a "huge" result set from an SQL query: about 2-3 million rows of a single varchar(50) column. That might not be big enough to worry about on its own, but what if I have to process 10-15 such columns?
So that brings me to my question: is there a way to retrieve a huge amount of data from SQL in small chunks, load each chunk into memory, and process it with whatever language/tool you like (I use C# and the SSIS ETL tool)?
If you feel this question needs to be clearer, please leave a comment.
Upvotes: 1
Views: 1019
Reputation: 9927
DECLARE @PageN int = 0 -- 0..N
DECLARE @RowPerPage int = 20

SELECT *
FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY [LastLoginDate]) AS rowNumber,
           Customer.*
    FROM dbo.Customer
    --WHERE <Search cond>
) paging
WHERE rowNumber >  @PageN * @RowPerPage
  AND rowNumber <= (@PageN + 1) * @RowPerPage;
Paging for MS SQL Server 2008, using ROW_NUMBER() since OFFSET/FETCH is not available before SQL Server 2012.
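For completeness, here is a minimal C# sketch of how such a paging query could be driven from client code, fetching one chunk at a time. It assumes the same dbo.Customer table and [LastLoginDate] column as the query above; the connection string is a placeholder you would replace with your own.

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class PagedReader
    {
        static void Main()
        {
            const int rowsPerPage = 20;  // matches @RowPerPage in the query above
            const string connStr = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder

            const string sql = @"
                SELECT *
                FROM (
                    SELECT ROW_NUMBER() OVER (ORDER BY [LastLoginDate]) AS rowNumber,
                           Customer.*
                    FROM dbo.Customer
                ) paging
                WHERE rowNumber >  @PageN * @RowPerPage
                  AND rowNumber <= (@PageN + 1) * @RowPerPage;";

            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();

                for (int pageN = 0; ; pageN++)
                {
                    int rowsInPage = 0;

                    using (var cmd = new SqlCommand(sql, conn))
                    {
                        cmd.Parameters.Add("@PageN", SqlDbType.Int).Value = pageN;
                        cmd.Parameters.Add("@RowPerPage", SqlDbType.Int).Value = rowsPerPage;

                        using (var reader = cmd.ExecuteReader())
                        {
                            while (reader.Read())
                            {
                                rowsInPage++;
                                // Process one row of the current chunk here.
                            }
                        }
                    }

                    // A short (or empty) page means the end of the result set was reached.
                    if (rowsInPage < rowsPerPage)
                        break;
                }
            }
        }
    }

Note that because ROW_NUMBER() is recomputed for every page, the ORDER BY column should be stable (ideally indexed) so rows don't shift between chunks.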
Upvotes: 2
Reputation: 3671
Yes, that's possible. In C# you can use LINQ, and as long as you iterate over the results (and avoid calling ToList() or anything similar) the rows of the result set are streamed and loaded one at a time. If you're not using LINQ, you can use a SqlDataReader instead.
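A minimal sketch of the SqlDataReader approach, assuming the dbo.Customer table from the other answer; the column name [SomeVarcharColumn] and the connection string are placeholders.

    using System;
    using System.Data.SqlClient;

    class StreamingReader
    {
        static void Main()
        {
            const string connStr = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("SELECT [SomeVarcharColumn] FROM dbo.Customer", conn))
            {
                conn.Open();

                // ExecuteReader streams rows from the server; only the current row
                // is held in memory, so millions of rows stay manageable.
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        string value = reader.GetString(0);
                        // Process the value here instead of accumulating it in a list.
                    }
                }
            }
        }
    }

The key point in either variant is the same: process each row as it arrives rather than materializing the whole result set in memory.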
Upvotes: 1