Reputation: 10037
I used Red Gate SQL Data Compare to generate a .sql file so I could run it on my local machine. The problem is that the file is over 300 MB, which means copy and paste won't work because the clipboard can't handle it, and when I try to open the file in SQL Server Management Studio I get an error about the file being too large.
Is there a way to run a large .sql file? The file basically contains data for two new tables.
Upvotes: 284
Views: 304193
Reputation: 11
On local Ubuntu you can do it like this:
sqlcmd -S tcp:localhost,1433 -U sa -P "password" -d databaseName -i "Documents/script.sql" -N -b -C
Before running this, create the database, then put your database name in place of databaseName.
Upvotes: 1
Reputation: 770
There is another option for anyone still encountering problems importing really large SQL dumps.
If you have access to the server, consider exporting the database in multiple parts when possible: first the structure, then per table (or per group of related objects) an export of the data in smaller pieces, instead of one big file.
If you don't have access to the server, or are required to use the existing big file, you can try splitting it into parts with SQLDumpSplitter: https://philiplb.de/sqldumpsplitter3/.
Then import the pieces to get a full copy of the database.
Good luck, guys.
Upvotes: 0
Reputation: 1170
You can use this tool as well. It is really useful.
NB: The link was broken, so I have updated it.
Upvotes: 21
Reputation: 961
Run the script file:
1. Open a command prompt window.
2. In the command prompt window, type: sqlcmd -S <ServerName\InstanceName> -i C:\yourScript.sql
3. Press ENTER.
Upvotes: 3
Reputation: 21
sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767
I successfully ran a 365 MB .sql file with this command; it completed in about 15 minutes and solved a problem that had taken me a long time to figure out.
Upvotes: 2
Reputation: 11235
I had a similar problem. My SQL script file was over 150 MB (almost 900k very simple INSERTs). I used the solution advised by Takuro (the answer in this question), but I still got an error saying there was not enough memory ("There is insufficient system memory in resource pool 'internal' to run this query").
What helped me was putting a GO command after every 50k INSERTs.
(This doesn't directly address the question (file size), but I believe it resolves a problem that is indirectly connected with a large SQL script: in my case, a huge number of INSERT commands.)
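For illustration, the batching looks something like this (table and values here are hypothetical):
INSERT INTO dbo.Example (Id, Val) VALUES (1, 'a');
-- ... INSERTs 2 through 50,000 ...
GO
INSERT INTO dbo.Example (Id, Val) VALUES (50001, 'b');
-- ... the next 50k INSERTs ...
GO
Each GO ends the current batch, so the server parses and executes the INSERTs in chunks instead of as one enormous batch.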
Upvotes: 2
Reputation: 39
Hope this helps you!
sqlcmd -U UserName -S <ServerName\InstanceName> -i U:\<Path>\script.sql
Upvotes: 3
Reputation: 260
I am using MSSQL Express 2014 and none of the solutions worked for me. They all just crashed SQL. As I only needed to run a one-off script with many simple INSERT statements, I got around it by writing a little console app as a very last resort:
class Program
{
    static void Main(string[] args)
    {
        RunScript();
    }

    private static void RunScript()
    {
        // Entity Framework context for the target database
        My_DataEntities db = new My_DataEntities();

        // Stream the script line by line so the whole file is never
        // held in memory; each line is one complete INSERT statement.
        using (var file = new System.IO.StreamReader("c:\\ukpostcodesmssql.sql"))
        {
            string line;
            while ((line = file.ReadLine()) != null)
            {
                // Skip blank lines, which would fail as SQL commands.
                if (!string.IsNullOrWhiteSpace(line))
                    db.Database.ExecuteSqlCommand(line);
            }
        }
    }
}
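Note that this works only because the generated script has one complete statement per line; a script with multi-line statements would need the lines buffered up to each statement terminator before executing.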
Upvotes: 8
Reputation: 373
1. Open a command prompt with administrator privileges.
2. Change directory to where the .sql file is stored.
3. Execute the following command:
sqlcmd -S 'your server name' -U 'user name of server' -P 'password of server' -d 'db name' -i script.sql
Upvotes: 20
Reputation: 30021
From the command prompt, start up sqlcmd:
sqlcmd -S <server> -i C:\<your file here>.sql
Just replace <server> with the location of your SQL box and <your file here> with the name of your script. Don't forget, if you're using a SQL instance, the syntax is:
sqlcmd -S <server>\instance
Here is the list of all arguments you can pass sqlcmd:
Sqlcmd [-U login id] [-P password]
[-S server] [-H hostname] [-E trusted connection]
[-d use database name] [-l login timeout] [-t query timeout]
[-h headers] [-s colseparator] [-w screen width]
[-a packetsize] [-e echo input] [-I Enable Quoted Identifiers]
[-c cmdend] [-L[c] list servers[clean output]]
[-q "cmdline query"] [-Q "cmdline query" and exit]
[-m errorlevel] [-V severitylevel] [-W remove trailing spaces]
[-u unicode output] [-r[0|1] msgs to stderr]
[-i inputfile] [-o outputfile] [-z new password]
[-f codepage | i:codepage[,o:codepage]] [-Z new password and exit]
[-k[1|2] remove[replace] control characters]
[-y variable length type display width]
[-Y fixed length type display width]
[-p[1] print statistics[colon format]]
[-R use client regional setting]
[-b On error batch abort]
[-v var = "value"...] [-A dedicated admin connection]
[-X[1] disable commands, startup script, environment variables [and exit]]
[-x disable variable substitution]
[-? show syntax summary]
Upvotes: 529
Reputation: 871
I had exactly the same issue and had been struggling for a while, then finally found the solution, which is to set the -a parameter of sqlcmd in order to change its default packet size:
sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767
Upvotes: 87
Reputation: 29527
The file basically contains data for two new tables.
Then you may find it simpler to just DTS (or SSIS, if this is SQL Server 2005+) the data over, if the two servers are on the same network.
If the two servers are not on the same network, you can back up the source database and restore it to a new database on the destination server. Then you can use DTS/SSIS, or even a simple INSERT INTO SELECT, to transfer the two tables to the destination database.
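A minimal sketch of that last option, assuming the source database has been restored as SourceDbCopy on the destination server (all database, schema, and table names here are placeholders):
INSERT INTO DestDb.dbo.NewTable (Col1, Col2)
SELECT Col1, Col2
FROM SourceDbCopy.dbo.NewTable;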
Upvotes: 1
Reputation: 11138
Your question is quite similar to this one.
You can save your file/script as .txt or .sql and run it from SQL Server Management Studio (I think the menu is Open/Query, then just run the query in the SSMS interface). You might have to update the first line, indicating the database to be created or selected on your local machine.
If you have to do this data transfer very often, you could then go for replication. Depending on your needs, snapshot replication could be ok. If you have to synch the data between your two servers, you could go for a more complex model such as merge replication.
EDIT: I didn't notice that your problem with SSMS was linked to file size. In that case you can go for the command line, as proposed by others, snapshot replication (publish on your main server, subscribe on your local one, replicate, then unsubscribe), or even backup/restore.
Upvotes: 1
Reputation: 28499
Run it at the command line with osql, see here:
http://metrix.fcny.org/wiki/display/dev/How+to+execute+a+.SQL+script+using+OSQL
Upvotes: 3