Reputation: 47560
In my Windows application I sometimes have to run queries which return tens of thousands of records. I am using WCF services to retrieve the data, and sometimes it is very slow because of the huge amount of data. I am using a SQL Server 2008 database.
I would like to retrieve the records 100 at a time for a SQL query that returns tens of thousands of rows. Is there any way to handle such cases? Can a SQL cursor cater for this?
Basically it's somewhat similar to paging, or rendering data on a grid in real time, but I want to start displaying records before everything reaches the client machine. When the amount of data is huge I usually create a BackgroundWorker for data retrieval, but I am not sure how to combine these two techniques. It would be nice if you could show me a sample.
Upvotes: 1
Views: 9370
Reputation: 10516
I had a similar problem once where I had to stream millions of rows over WCF. What worked for me is a chunking channel.
On the server side your class implements something along these lines:
void InitQuery(string queryString); // initializes a SqlDataReader that pulls the result out of the DB
YourClass[] GetChunk(int chunkSize); // returns the next chunkSize elements from the reader as an array; returns null when you're done
From the client you set up the query with the first call, then read chunks with the second. You don't have to worry about paging etc. in the SQL query; just get the next x records from the reader, put them into your DataContract class and return them. The SqlDataReader keeps track of where you are. (This might not scale too well with many concurrent users, though.)
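A minimal sketch of what that chunking service could look like (all type and member names here are hypothetical, and error handling and connection-string details are omitted):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.ServiceModel;

// Hypothetical session-enabled WCF contract: one call starts the query,
// repeated calls drain the open SqlDataReader in chunks.
[ServiceContract(SessionMode = SessionMode.Required)]
public interface IChunkedQueryService
{
    [OperationContract(IsInitiating = true)]
    void InitQuery(string queryString);

    [OperationContract]
    Record[] GetChunk(int chunkSize); // returns null when the reader is exhausted
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerSession)]
public class ChunkedQueryService : IChunkedQueryService
{
    private SqlConnection _connection;
    private SqlDataReader _reader;

    public void InitQuery(string queryString)
    {
        _connection = new SqlConnection("...");   // your connection string
        _connection.Open();
        _reader = new SqlCommand(queryString, _connection).ExecuteReader();
    }

    public Record[] GetChunk(int chunkSize)
    {
        var chunk = new List<Record>(chunkSize);
        while (chunk.Count < chunkSize && _reader.Read())
            chunk.Add(new Record { Id = _reader.GetInt32(0), Name = _reader.GetString(1) });
        return chunk.Count > 0 ? chunk.ToArray() : null; // null signals "done"
    }
}
```

Here `Record` stands in for whatever [DataContract] class holds one row of your result.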
Make sure you keep the same proxy alive on the client between calls, because the service maintains state between them. Look into SessionMode.
Also, I got a huge performance boost by running it over TCP and fiddling with the security settings.
You didn't mention whether you're using WinForms or WPF, but both support ways to lazy-load data into a grid. You want to call GetChunk from a background thread and, when it has the results, pass the whole array to the UI thread and loop over it there. (Do not loop over it from the background thread, because you would need to invoke for every row in the array, which would be slow.)
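That background-thread pattern, combined with the BackgroundWorker mentioned in the question, could look roughly like this WinForms sketch (the `proxy`, `Record` and `grid` names are assumptions, not part of any real API):

```csharp
using System.ComponentModel;

// Hypothetical WinForms loop: fetch chunks on a background thread and
// marshal each whole chunk to the UI thread in a single hop.
var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    Record[] chunk;
    while ((chunk = proxy.GetChunk(100)) != null)
        worker.ReportProgress(0, chunk);   // one cross-thread call per chunk, not per row
};

worker.ProgressChanged += (s, e) =>
{
    // This handler runs on the UI thread: loop over the array here.
    foreach (var record in (Record[])e.UserState)
        grid.Rows.Add(record.Id, record.Name);
};

worker.RunWorkerAsync();
```

The key design point is that ReportProgress crosses threads once per 100 rows instead of once per row.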
Just some things I learned the hard way; let me know if you need details anywhere.
Upvotes: 2
Reputation: 17508
I don't think there is a need to use a BackgroundWorker, as the data needs to be loaded with the page. If you want your data displayed before the rest of the page, the only way I can imagine doing this is by sending an almost empty HTML page back to the client with some JS that executes on page load and then goes back to the server (via AJAX) to get the grid data and the rest of the page. That could be a cumbersome and messy task.
An approach you could take is to load only the records you need for the first page of your grid's data. Think of Google search results: it doesn't load all the results into memory, or into your page for that matter, only the first page of data. When you need more results, it retrieves them.
To get records from your DB within a certain range, you could use a query like this, which gets records 1 to 100:
SELECT UserID, FirstName
FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY UserID, FirstName) AS ROW_NUM, UserID, FirstName
    FROM Users.Users
) AS T1
WHERE T1.ROW_NUM BETWEEN 1 AND 100
ORDER BY T1.ROW_NUM
This SQL would obviously be put into a stored procedure and called with parameters for the row range.
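Calling such a stored procedure from C# via ADO.NET might look like this sketch (the procedure name `GetUsersPage` and its parameter names are assumptions for illustration):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

// Hypothetical call to a ROW_NUMBER-based paging stored procedure.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("GetUsersPage", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@StartRow", 1);    // first record of the page
    command.Parameters.AddWithValue("@EndRow", 100);    // last record of the page

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0}: {1}", reader["UserID"], reader["FirstName"]);
    }
}
```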
Upvotes: 1
Reputation: 7429
I may be wrong, so instead of starting a new thread I am posting my take here; it may answer this post.
I think the essential thing is for the user to get a first glimpse of the results quickly. If he is fetching student records from the database and there are 1000 results, he would hate to wait for the whole query to be processed. So what matters is that you get the data in chunks and display it as soon as possible.
The query should be divided into data ranges, fetching one small portion at a time. If you are using .NET Framework 4.0, Parallel loops would help with this.
I have seen that performance is good compared to regular loops.
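As a rough illustration of that idea, each page range could be fetched by a parallel loop iteration. This is only a sketch: `FetchPage` is a hypothetical helper assumed to execute a ROW_NUMBER-style paging query for the given page, and whether this actually helps depends on your database and connection pool.

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical: fetch page ranges concurrently with .NET 4's Parallel.For.
int pageSize = 100;
int pageCount = 10;
var pages = new ConcurrentBag<Record[]>();   // thread-safe collection of fetched pages

Parallel.For(0, pageCount, pageNumber =>
{
    // FetchPage is assumed to run a paged query (records N to N + pageSize - 1).
    pages.Add(FetchPage(pageNumber + 1, pageSize));
});
```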
Upvotes: 1
Reputation: 520
You can use a Common Table Expression with the ROW_NUMBER ranking function. Here is an example from some code I have written:
CREATE PROCEDURE PagingSample
    @PageNumber int,
    @PageSize int
AS
WITH Results AS (
    SELECT
        ROW_NUMBER() OVER (ORDER BY MR.MRN ASC) AS RowNumber,
        MR.MRN
    FROM dbo.SomeTable MR WITH (NOLOCK)
)
SELECT
    R.RowNumber,
    R.MRN
FROM Results R
WHERE
    R.RowNumber > (@PageNumber * @PageSize) - @PageSize
    AND R.RowNumber < (@PageNumber * @PageSize) + 1
Now pass the page number and the page size to the sproc, like so:
Exec PagingSample @PageNumber = 3, @PageSize = 100
And you will get back records 201 through 300.
Upvotes: 2