Reputation: 675
Here is my dilemma:
8 tables
Up to 72 columns
Data types include int, smallint, decimal, smalldatetime, and char
Nothing too scary there.
I have created unique data sets of 100, 1,000, 10,000, 50,000, 100,000, and 500,000 rows, but they are all plain INSERT statements. This is for volumetric testing.
When I attempted to run the 50K script, SQL Server ran out of memory.
What are my options? Will I need to break it into smaller queries? Can I commit every X rows, and how do I determine the maximum batch size? Is there a better way than INSERT statements? Excel and Access are ruled out due to the volume of data.
Does anyone have an example of a T-SQL loop statement to create different types of data?
Upvotes: 1
Views: 4006
Reputation: 286
Use SSIS; in the Data Flow Task, the destination component has an option to specify the batch size.
Upvotes: 0
Reputation: 19469
Since you are using the .NET Framework, you should probably look into the UpdateBatchSize property on the DataAdapter. It will take care of a lot of the internals for you. More info at msdn.microsoft.com/en-us/library/kbbwt18a(VS.80).aspx
Upvotes: 0
Reputation: 1496
BULK INSERT is exactly for this kind of scenario: loading bulk data into SQL Server. Why is it far more efficient and faster than normal inserts? Because the B-trees are built bottom-up instead of through the usual top-down insert-and-split approach.
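For example, a minimal sketch, assuming the generated rows are first exported to a flat file; the table name and file path are placeholders:
BULK INSERT dbo.MyTestTable
FROM 'C:\testdata\rows_50000.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the file
    ROWTERMINATOR = '\n',    -- row delimiter
    BATCHSIZE = 10000        -- commit every 10,000 rows to limit memory and log pressure
);
The BATCHSIZE option addresses the "commit every X rows" part of the question: each batch is committed as a separate transaction.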
Upvotes: 2
Reputation: 1559
"Has any one an example loop t-sql statement to create different types data?"
A loop is also an option.
declare @i int
set @i = 0
while @i < 50000
begin
    set @i = @i + 1
    insert into XXX values (@i, 1, 2, 3)
end
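To cover the "different types of data" part of the question, the same loop can derive varied values from the counter; a minimal sketch, where dbo.TestData and its column names are assumptions for illustration:
declare @i int
set @i = 0
while @i < 50000
begin
    set @i = @i + 1
    insert into dbo.TestData (IntCol, DecCol, DateCol, CharCol)
    values (@i,                                -- int
            cast(@i as decimal(10, 2)) / 3,    -- decimal
            dateadd(minute, @i, '2009-01-01'), -- stays within smalldatetime range
            char(65 + @i % 26))                -- char: cycles through A..Z
end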
Upvotes: 0
Reputation: 1559
Split portions of the INSERT statements with the GO batch separator. It ends one batch and begins another. For ease of test generation, it is fine to put a GO after every line.
insert into XXX values (1, 2, 3, ...)
...
insert into XXX values (2, 3, 4, ...)
GO
insert into XXX values (3, 4, 5, ...)
...
insert into XXX values (4, 5, 6, ...)
GO
or
insert into XXX values (1, 2, 3, ...)
GO
insert into XXX values (2, 3, 4, ...)
GO
insert into XXX values (3, 4, 5, ...)
GO
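A related trick, if the script runs through SSMS or sqlcmd: GO accepts a repeat count, so one small batch can be re-executed many times. This inserts the same literal values on each pass, so it suits tables whose unique columns are generated (e.g., an identity); XXX is the same placeholder table as above:
insert into XXX values (1, 2, 3)
GO 50000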
Upvotes: 3