Reputation: 3974
I have a very large SQL insert file which throws "out of memory" errors when I run it in SQL Enterprise Manager.
The advice I have seen is to add the "GO" command every X rows so the inserts are "batched".
I am trying to write a small function that reads the file and adds a line with the text "GO" after every 50 lines.
The code I have written also throws a System.OutOfMemoryException when I run it.
Can anyone suggest a better way of writing my code to fix this problem please?
This is what I have written:
public static void AddGo()
{
    int currentline = 0;
    string FilePath = @"C:\Users\trevo_000\Desktop\fmm89386.sql";

    var text = new StringBuilder();
    foreach (string s in File.ReadAllLines(FilePath))
    {
        // add the current line
        text.AppendLine(s);

        // increase the line counter
        currentline += 1;
        if (currentline == 50)
        {
            text.AppendLine("GO");
            currentline = 0;
        }
    }

    using (var file = new StreamWriter(File.Create(@"C:\Users\trevo_000\Desktop\fmm89386Clean.sql")))
    {
        file.Write(text.ToString());
    }
}
Upvotes: 0
Views: 365
Reputation: 2203
You're keeping the whole file in memory and then writing it from memory to a file. Instead of doing that, write the output file as you work through the input file; this sort of thing:
public static void AddGo()
{
    int currentline = 0;
    string inputFilePath = @"C:\Users\trevo_000\Desktop\fmm89386.sql";
    string outputFilePath = @"C:\Users\trevo_000\Desktop\fmm89386Clean.sql";

    using (var outputFileStream = File.CreateText(outputFilePath))
    {
        foreach (string s in File.ReadLines(inputFilePath))
        {
            // add the current line
            outputFileStream.WriteLine(s);

            // increase the line counter
            currentline += 1;
            if (currentline == 50)
            {
                outputFileStream.WriteLine("GO");
                currentline = 0;
            }
        }
    }
}
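If you want to reuse this elsewhere, the same streaming approach generalizes to any batch size. Here's a minimal sketch; the class name, parameter names, and the default batch size are illustrative, not from the original post:

public static class SqlBatcher
{
    // Copies inputPath to outputPath, inserting a "GO" line after every
    // batchSize lines. Both files are streamed, so memory use stays flat
    // regardless of file size.
    public static void AddGo(string inputPath, string outputPath, int batchSize = 50)
    {
        int currentLine = 0;
        using (var writer = File.CreateText(outputPath))
        {
            foreach (string line in File.ReadLines(inputPath))
            {
                writer.WriteLine(line);
                if (++currentLine % batchSize == 0)
                {
                    writer.WriteLine("GO");
                }
            }
        }
    }
}

Called as SqlBatcher.AddGo(inputFilePath, outputFilePath), it behaves the same as the version above.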
Note the use of ReadLines on the input file, rather than ReadAllLines - see "What is the difference between File.ReadLines() and File.ReadAllLines()?" for more info on that.
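The difference shows up in the return types. A quick demo (the path is just a placeholder):

using System;
using System.IO;

class ReadLinesDemo
{
    static void Main()
    {
        string path = @"C:\temp\big.sql"; // placeholder path

        // File.ReadLines returns a lazy IEnumerable<string>; this loop
        // reads one line from disk at a time, so memory use stays flat.
        int count = 0;
        foreach (string line in File.ReadLines(path))
        {
            count++;
        }
        Console.WriteLine(count);

        // File.ReadAllLines returns string[]; the entire file is loaded
        // into memory before the first element is available.
        string[] allLines = File.ReadAllLines(path);
        Console.WriteLine(allLines.Length);
    }
}

ReadAllLines has to materialize the whole array before returning, which is what exhausted memory in the original code; ReadLines streams, so the foreach works on arbitrarily large files.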
Upvotes: 3