Reputation: 9648
I have a method that has to execute SQL scripts many times. These scripts are used to create tables, views, stored procedures, and functions in the database. I came up with this code, which works fine with 100 files.
foreach (var folder in ActivityConstant.SourceFolders)
{
    var destination = context.GetValue(this.DestinationPath) + folder;
    // TODO: Execute all sql folders instead of each sql file.
    // this is cmd command for %f in (*.sql) do sqlcmd /S <servername> /d <dbname> /E /i "%f"
    foreach (var file in Directory.GetFiles(destination))
    {
        var begin = DateTime.Now;
        context.TrackBuildWarning(string.Format("Start exec sql file at {0}.", DateTime.Now));

        Process process = new Process();
        process.StartInfo.UseShellExecute = false;
        process.StartInfo.RedirectStandardOutput = true;
        process.StartInfo.RedirectStandardError = true;
        process.StartInfo.CreateNoWindow = true;
        process.StartInfo.FileName = "sqlcmd.exe";
        process.StartInfo.Arguments = string.Format("-S {0} -d {1} -i {2} -U {3} -P {4}",
            sqlServerName,
            databaseName,
            file,
            sqlUserName,
            sqlPassword);
        process.StartInfo.WorkingDirectory = @"C:\";
        process.Start();
        //process.WaitForExit();

        context.TrackBuildWarning(
            string.Format(
                "Finished exec sql file at {0} total time {1} milliseconds.",
                DateTime.Now,
                DateTime.Now.Subtract(begin).TotalMilliseconds));
    }
}
Now I am moving on to the next step: testing it against our database, which has around 600 files (tables, views, stored procedures, and functions), and it looks like the current code cannot handle this large a number of scripts.
From my records, it takes between 3 and 5 minutes to run around 100 files. At the time of writing this question, it has already been running for 40 minutes on those 600 files.
I would like to know how I can improve my code. I also welcome any suggestions if there is a better way to achieve the goal.
Upvotes: 0
Views: 835
Reputation: 392833
I suspect the most useful way to improve performance would be to move the execution of the actual SQL commands into the application itself.
This avoids spawning a separate sqlcmd process (and a separate login) for every file.
I'd look at storing the scripts as resources, then executing them on an SqlConnection.
Think about transaction management, where applicable. The easiest way to handle this would be to still open a new connection for each 'script'. As long as you do this using the same connection string each time, the connection will be pooled automatically by the Framework.
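A rough sketch of that approach (the RunSqlScripts helper, the connection-string format, and the GO-splitting regex are illustrative assumptions, not code from the question):

using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

static void RunSqlScripts(string destination, string connectionString)
{
    foreach (var file in Directory.GetFiles(destination, "*.sql"))
    {
        var script = File.ReadAllText(file);

        // sqlcmd understands GO as a batch separator, SqlCommand does not,
        // so split the script into individual batches first.
        var batches = Regex.Split(script, @"^\s*GO\s*$",
                                  RegexOptions.Multiline | RegexOptions.IgnoreCase);

        // New connection per script, as suggested above; with a constant
        // connection string the Framework pools it, so this stays cheap.
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (var batch in batches)
            {
                if (string.IsNullOrWhiteSpace(batch))
                    continue;

                using (var command = new SqlCommand(batch, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}

The connection string could be built from the variables already in the question, e.g. string.Format("Server={0};Database={1};User Id={2};Password={3}", sqlServerName, databaseName, sqlUserName, sqlPassword).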
Upvotes: 0
Reputation: 426
You can try to merge the files before executing them:
using (var merged = File.OpenWrite("merged.sql"))
{
    foreach (var file in Directory.GetFiles(destination))
    {
        var data = File.ReadAllBytes(file);
        merged.Write(data, 0, data.Length);
    }
}
//continue with "merged.sql" as source file for the sqlcmd
Also, there are plenty of existing tools for this task; google for "Database Deployment Tool" or something similar.
Upvotes: 0
Reputation: 47038
You are starting 600 processes at (almost) the same time, which is not good for performance: they probably take a LOT of RAM, and swapping may occur.
It will also create a lot of load on your SQL Server; there might be a memory issue on that side too.
Try to run your queries sequentially, or at least only a few at a time (see the sketch below).
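The minimal change in that direction, keeping the question's Process setup, is to wait for each sqlcmd to finish before starting the next one. A sketch of the end of the inner loop (the output/exit-code handling is an addition for illustration, not part of the original code):

process.Start();

// stdout/stderr are redirected in the original code, so drain them before
// waiting; otherwise sqlcmd can block on a full output buffer (for very
// chatty scripts an asynchronous read would be safer).
string output = process.StandardOutput.ReadToEnd();
string errors = process.StandardError.ReadToEnd();

process.WaitForExit(); // run the scripts one at a time instead of 600 at once

if (process.ExitCode != 0)
{
    context.TrackBuildWarning(string.Format("sqlcmd failed for {0}: {1}", file, errors));
}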
Edit
As a side note, you should consider using SqlCommand instead of launching external processes.
Upvotes: 3
Reputation: 13931
If this is only one database per set of files, you shouldn't log in again for every file/query.
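For example (a sketch only, with a hypothetical connectionString and assuming each file contains a single batch with no GO separators), the connection can be opened once and reused for every file:

// requires System.Data.SqlClient and System.IO
using (var connection = new SqlConnection(connectionString))
{
    connection.Open(); // log in once

    foreach (var file in Directory.GetFiles(destination))
    {
        using (var command = new SqlCommand(File.ReadAllText(file), connection))
        {
            command.ExecuteNonQuery(); // reuse the same session for every script
        }
    }
}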
Upvotes: 0