Reputation:
I have a big SQL file that does not fit into memory and needs to be executed against Microsoft SQL Server 2008. It seems that the sqlcmd.exe tool always loads the whole file into memory first, which is impossible in this case. Any ideas?
Unfortunately, I can't split the script because it is generated by Red Gate's excellent SQL Data Compare. The entire script is one big transaction and I want to keep it that way. It never occurred to me that a gigantic script would be unusual, since having a lot of data is common in the database world. The script is 3 GB in size.
Upvotes: 5
Views: 11660
Reputation: 64
Use the sqlcmd command-line utility. Example:
sqlcmd -S myServer\instanceName -i C:\myScript.sql
Upvotes: 0
Reputation: 331
The way I understand it, SSMS is a 32-bit application, so it can't load a script over roughly 1.5-2 GB. You can run the script with SQLCMD.exe, but then you may run into problems because of the transaction size: SqlCmd keeps the whole transaction in memory. What you can do in SQL Data Compare is go into the options and enable "split transactions", which may help.
SQL Data Compare will also do partial updates to BLOBs, which will solve the "enormous BLOB" issue.
This is of course based on the latest version of Data Compare; some versions may not have these features.
Another option may be to use SQL Compare to script the schema out to a folder, then use SQL Data Compare to sync the data into that folder. That gives you a file for each table rather than one massive file.
Hope this helps.
Upvotes: 0
Reputation:
I ran into this problem a few months ago. I generate sync scripts with SQL Data Compare on a weekly and monthly basis for several of our catalog databases, and they are routinely larger than 500 MB. My solution was to write a VBScript that chops the update script into batches of 50 to 1,000 commands each. The drawback of this approach is losing the ability to roll back all changes if something breaks halfway through the database update.
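A minimal sketch of that chopping approach, in Python rather than VBScript (the function name, output file names, and the batches-per-chunk count are all illustrative): it streams the script line by line, so the whole file is never held in memory, and starts a new chunk file after every N batch separators (the lines containing only GO):

```python
import os
import re

def split_sql_batches(script_path, out_dir, batches_per_chunk=1000):
    """Split a large SQL script into chunk files of at most
    `batches_per_chunk` GO-terminated batches each, streaming the
    input so it is never fully loaded into memory."""
    os.makedirs(out_dir, exist_ok=True)
    out_paths = []
    batch_lines = []
    go_count = 0

    def flush():
        nonlocal batch_lines, go_count
        if not batch_lines:
            return
        path = os.path.join(out_dir, "chunk_%04d.sql" % len(out_paths))
        with open(path, "w", encoding="utf-8") as f:
            f.writelines(batch_lines)
        out_paths.append(path)
        batch_lines = []
        go_count = 0

    with open(script_path, encoding="utf-8") as f:
        for line in f:
            batch_lines.append(line)
            # GO on a line by itself marks a batch boundary in sqlcmd scripts
            if re.match(r"^\s*GO\s*$", line, re.IGNORECASE):
                go_count += 1
                if go_count >= batches_per_chunk:
                    flush()
    flush()  # write any trailing partial chunk
    return out_paths
```

Each chunk file can then be run with sqlcmd -i in turn. As noted above, the trade-off is that the update is no longer one atomic transaction.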
Upvotes: 2
Reputation: 9281
I've had this problem before, where the script contained an enormous XML string that was being used with OpenXML. The actual SQL was rather minimal: updating some values in a table.
I ended up inserting the data (in chunks) into a temporary table until all the info that was in the XML was stored. Then I ran my update statement.
Edit, added after more details were posted:
You may want to select large chunks in the tool and have SQL Data Compare generate the scripts in chunks. That way you still get transactions, one per chunk. You can select large sections by simply highlighting a range and hitting the space bar.
Upvotes: 1
Reputation: 103587
1-800-redgate-support.....
Upvotes: 0
Reputation: 33914
RedGate's SQL Compare has an option to execute the statements directly, instead of generating a SQL script and executing it later. Is there a reason this wouldn't work - in other words, is there a reason you require a SQL script and can't use the application's "synchronize now" functionality?
Upvotes: 3
Reputation: 12016
If it is that big, the script is either too complex or is repetitive. In either case, as others have suggested, the only sensible thing is to break it down into manageable chunks.
Is this a one-off exercise or a regular event?
Upvotes: 1
Reputation: 60408
What/who created the SQL script? Get whatever created the file to split the script into logical chunks, by transaction or by statement (depending on how the file is structured). If the source can't do this, then whip up a script to split the file logically.
Upvotes: 1