Reputation: 77
I am using Microsoft SQL Server Management Studio. I have minimal experience with SQL Server, but I have used MySQL in the past and am pretty good at it.
My question is: I have a rather large database with over 130 million records.
I want to pull only the URL column, but every single record in that column.
SQL Server Management Studio jams up when saving the results to a file, with one error or another, so I have come to the conclusion that I need to do this in chunks.
I am running this command at the moment and saving the output to a file:
SELECT TOP 20000000 [URL]
FROM [dbo].[siteentry]
WHERE [Content] LIKE ''
However, when this command finishes, I have no clue how to skip the first 20,000,000 rows and continue on to the next batch of 20,000,000 without killing the server.
Any help would be great.
Upvotes: 8
Views: 7754
Reputation: 9221
If all you want is a text file containing all the URLs, then I suggest you use the bcp command-line utility:
bcp "SELECT url FROM [dbo].[siteentry] WHERE [Content] LIKE ''" queryout Output.txt -S Server -T -d DatabaseName -c
It will generate the file in one go, so there is no need to split it into chunks, and you won't get annoying out-of-memory errors from SSMS.
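If you connect with SQL Server authentication rather than a trusted connection, the same command should work with -T swapped for -U and -P (the login and password below are placeholders):
bcp "SELECT url FROM [dbo].[siteentry] WHERE [Content] LIKE ''" queryout Output.txt -S Server -U YourLogin -P YourPassword -d DatabaseName -c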
Upvotes: 1
Reputation: 69
You can use an "offset" to skip the rows. Check https://technet.microsoft.com/en-us/library/gg699618(v=sql.110).aspx for more information.
Upvotes: 2
Reputation: 754538
With SQL Server 2012, you can use the OFFSET ... FETCH commands:
SELECT [URL]
FROM [dbo].[siteentry]
WHERE [Content] LIKE ''
ORDER BY SomeColumn
OFFSET 20000 ROWS
FETCH NEXT 20000 ROWS ONLY
For this to work, you must order by some column in your table - which you should do anyway, since a TOP ... without an ORDER BY is useless: you'll get back arbitrary rows.
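As a concrete sketch using the batch size from the question (and assuming the table has an indexed key column, here called [Id], to order by), the first two chunks would look like this:
-- First batch of 20,000,000 rows
SELECT [URL]
FROM [dbo].[siteentry]
WHERE [Content] LIKE ''
ORDER BY [Id]
OFFSET 0 ROWS
FETCH NEXT 20000000 ROWS ONLY;

-- Second batch: skip the 20,000,000 rows already exported
SELECT [URL]
FROM [dbo].[siteentry]
WHERE [Content] LIKE ''
ORDER BY [Id]
OFFSET 20000000 ROWS
FETCH NEXT 20000000 ROWS ONLY;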
Upvotes: 12