whytheq

Reputation: 35557

Several users requesting 30,000 rows of information simultaneously

I have a test bed application available to 3 users.

If all three users run the app at the same time, a stored procedure runs for each of them, and each gets a result set of approximately 30,000 records, currently held in an ADO.NET DataTable. The app then needs to move each result set into an Excel template.

The template is an xlsm file which contains some VBA that needs to run after importing the data. This template is saved in the solution.

I'm going to attempt to move the data from the DataTable to a worksheet using Excel Interop.

Does anyone have experience of moving this amount of data from a DataTable into Excel?

@slugster suggested "setting up a datasource from Excel and just run the query using a dirty read" ... is it possible to set up a datasource in Excel linked to a non-materialized datatable?

Will looping through a table of 30,000 rows by 10 columns via Excel Interop run into problems?

Upvotes: 3

Views: 411

Answers (1)

cycoder

Reputation: 26

Does anyone have experience of moving this amount of data from a DataTable into Excel?

Not from a DataTable object, but yes, I have moved that amount of data using Excel's built-in ability to import data.

@slugster suggested "setting up a datasource from Excel and just run the query using a dirty read" ... is it possible to set up a datasource in Excel linked to a non-materialized datatable?

I would suggest this as well. To go further, I'd suggest creating a stored procedure and then calling that. You should see better performance using a stored procedure: it can collect and prepare the data, then return it to Excel.

You may also be able to build a caching mechanism into the procedure. For example, if your data only changes daily, you only rebuild the data in the source table once per day, so only the first user to request the data takes the initial performance hit. And depending on what type of post-processing you are doing in Excel in VBA, some of that could perhaps be handled in the procedure as well.

The procedure will also help reduce the possibility of locking issues if you add SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED to the top of the procedure, or use the (NOLOCK) hint on the tables you are willing to allow dirty reads from.
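If you do stay on the ADO.NET side rather than wiring Excel directly to the database, calling the procedure is straightforward. A minimal sketch, assuming a hypothetical procedure name (`dbo.GetReportData`) and a connection string you supply; the dirty-read isolation level would live inside the procedure itself:

```csharp
using System.Data;
using System.Data.SqlClient;

static class ReportLoader
{
    // Fill a DataTable from a stored procedure in one call.
    // "dbo.GetReportData" is a placeholder name for illustration.
    public static DataTable LoadReportData(string connectionString)
    {
        var table = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetReportData", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            // Inside the procedure, put
            //   SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
            // before the SELECT to allow dirty reads, instead of
            // scattering (NOLOCK) hints through the query.
            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(table); // opens/closes the connection itself
            }
        }
        return table;
    }
}
```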

Here's a nice article regarding using stored procedures in Excel: http://codebyjoshua.blogspot.com/2012/01/get-data-from-sql-server-stored.html

Will looping through a table that is 30000 rows by 10 columns via xl interop run into problems?

This depends on your definition of "problems." I could see possible performance implications; however, if you handle as much as you can in the stored procedure, you should be fine. In the world of data, that's really teeny tiny.
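One caveat on the Interop side: the cost is not the 30,000 rows, it's making 300,000 separate cross-process COM calls if you write cell by cell. The usual workaround is to stage the DataTable in a 2-D object array and assign it to a Range in a single call. A sketch, assuming a reference to Microsoft.Office.Interop.Excel and an already-open worksheet:

```csharp
using System.Data;
using Excel = Microsoft.Office.Interop.Excel;

static class SheetWriter
{
    // Copy an entire DataTable to a worksheet in one Range assignment.
    public static void WriteTable(DataTable table, Excel.Worksheet sheet)
    {
        int rows = table.Rows.Count;
        int cols = table.Columns.Count;

        // Stage everything in memory first.
        var buffer = new object[rows, cols];
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                buffer[r, c] = table.Rows[r][c];

        // One COM call instead of rows * cols calls.
        Excel.Range target = sheet.Range[
            sheet.Cells[1, 1], sheet.Cells[rows, cols]];
        target.Value2 = buffer;
    }
}
```

With this pattern, 30,000 x 10 typically writes in a second or two, where a cell-by-cell loop can take minutes.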

Upvotes: 1
