Reputation: 459
I have a Windows application in which a form is bound to data.
The form loads slowly because of the large amount of data. I am also showing paging in the form to navigate through the records.
How can I improve the performance?
Upvotes: 0
Views: 1985
Reputation: 9986
Bottom line: your app needs to 'page the data' effectively.
This means you will need to "lazy load" the data. The UI should only load and display the data it needs to show, and fetch additional data only when needed.
You didn't provide much information about your app and the data it loads, so let's assume that your app fetches 1,000,000,001 records.
You can use TOP 100 to fetch the top 100 records and fill your first page and the next four pages. On each 'Next' (or consecutive 'Nexts') you can hit the database to fetch the next batch of records. Note that you will need some mechanism (ROW_NUMBER?) to keep track of the records being fetched, row numbers, etc. This article discusses exactly what you are after and is what I am referring to.
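For illustration, here is a minimal C# sketch of that batch-at-a-time idea; the Orders table, OrderId key, column names, and connection string are all placeholders, not anything from the question:

    using System.Data;
    using System.Data.SqlClient;

    public static class RecordPager
    {
        // Hypothetical connection string and schema (an Orders table keyed by OrderId).
        const string ConnectionString = @"Server=.;Database=AppDb;Integrated Security=true";

        // Fetches the batch after the last key already shown, so each 'Next'
        // pulls only one batch of rows from the database.
        public static DataTable FetchNextBatch(int lastSeenOrderId, int batchSize)
        {
            const string sql = @"
                SELECT TOP (@BatchSize) OrderId, CustomerName, OrderDate
                FROM Orders
                WHERE OrderId > @LastSeenOrderId
                ORDER BY OrderId;";

            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@BatchSize", batchSize);
                command.Parameters.AddWithValue("@LastSeenOrderId", lastSeenOrderId);

                var batch = new DataTable();
                new SqlDataAdapter(command).Fill(batch); // Fill opens and closes the connection itself
                return batch;
            }
        }
    }

The caller remembers the last OrderId it displayed and passes it back in, which is one way to do the "keep track of the records being fetched" bookkeeping mentioned above.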
Upvotes: 3
Reputation: 5374
Here are some recommendations:
Virtual mode is designed for use with very large stores of data. When the VirtualMode property is true, you create a DataGridView with a set number of rows and columns and then handle the CellValueNeeded event to populate the cells.
If you are using some other control, check whether it provides a similar feature; ListView also has a VirtualMode property.
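A minimal sketch of virtual mode, with a hypothetical FetchRow lookup standing in for a real data source (in a real app it would read from a cache backed by paged database queries):

    using System;
    using System.Windows.Forms;

    public class VirtualGridForm : Form
    {
        readonly DataGridView grid = new DataGridView();

        public VirtualGridForm()
        {
            grid.Dock = DockStyle.Fill;
            grid.VirtualMode = true;     // the grid no longer stores cell values itself
            grid.ColumnCount = 2;
            grid.RowCount = 1000000;     // logical row count only; no data is loaded yet
            grid.CellValueNeeded += OnCellValueNeeded;
            Controls.Add(grid);
        }

        // Called lazily for each cell as it scrolls into view.
        void OnCellValueNeeded(object sender, DataGridViewCellValueEventArgs e)
        {
            e.Value = FetchRow(e.RowIndex)[e.ColumnIndex];
        }

        // Hypothetical accessor; generates dummy values for this sketch.
        static object[] FetchRow(int rowIndex)
        {
            return new object[] { rowIndex, "Row " + rowIndex };
        }
    }

The grid asks for values cell by cell as rows scroll into view, so memory stays proportional to what is visible plus whatever cache FetchRow keeps, not to the total row count.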
Upvotes: 1
Reputation: 15139
If you are using SQL Server, implement paging using Common Table Expressions and ROW_NUMBER(). This will let you pull less data from SQL Server and will definitely give better performance.
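For example, a paging query along those lines could look like the sketch below; the table, column names, and connection string are placeholders:

    using System.Data;
    using System.Data.SqlClient;

    public static class CtePager
    {
        // Hypothetical connection string and schema; page numbers are 1-based.
        const string ConnectionString = @"Server=.;Database=AppDb;Integrated Security=true";

        public static DataTable GetPage(int pageNumber, int pageSize)
        {
            // The CTE numbers every row, then the outer query slices out a
            // single page, so only pageSize rows ever leave the server.
            const string sql = @"
                WITH NumberedOrders AS (
                    SELECT OrderId, CustomerName, OrderDate,
                           ROW_NUMBER() OVER (ORDER BY OrderId) AS RowNum
                    FROM Orders
                )
                SELECT OrderId, CustomerName, OrderDate
                FROM NumberedOrders
                WHERE RowNum BETWEEN @First AND @Last;";

            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@First", (pageNumber - 1) * pageSize + 1);
                command.Parameters.AddWithValue("@Last", pageNumber * pageSize);

                var page = new DataTable();
                new SqlDataAdapter(command).Fill(page);
                return page;
            }
        }
    }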
Upvotes: 0
Reputation: 10155
It's hard to say for certain without knowing more about your application, but the immediate thing that comes to mind is that if your dataset is large, you should be doing pagination on the database side (by constraining the query using row counts) rather than on the application side.
Data binding is a convenience feature of .NET, but it comes with significant performance overhead. In general it is only acceptable for small datasets: less than a few thousand rows bound to a couple of dozen controls at most. Once the datasets grow very large, the overhead takes its toll quickly and no amount of tweaking will make the application fast. The key is always to constrain the amount of data the binding system has to juggle at any given time so that it doesn't drown in its own bookkeeping.
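As a rough illustration of that constraint, here is a sketch where a fetchPage delegate (standing in for a constrained, server-side query like the ones in the other answers) hands the grid exactly one page of rows at a time:

    using System;
    using System.Data;
    using System.Windows.Forms;

    public static class PageBinder
    {
        // fetchPage is assumed to run a row-count-constrained query and
        // return only (pageNumber, pageSize) worth of rows.
        public static void ShowPage(DataGridView grid, int pageNumber, int pageSize,
                                    Func<int, int, DataTable> fetchPage)
        {
            // The binding system only ever holds one page of rows in memory.
            grid.DataSource = fetchPage(pageNumber, pageSize);
        }
    }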
Upvotes: 2