Reputation: 73
I have been working on a program to archive old client data for the company I work for. The program copies the data from the work server to the local machine, creates a .zip file of all the data, then copies it to the archive server.
After it does all that, it deletes the original files and the local copies. Every now and then, the program errors because it can't delete the local copies off my computer. It does not error on every folder that it zips; it might fail after doing 300 of them, or after 5. It throws one of the following three errors: "The directory is not empty", "File is being used by another process", or "Access to the file is denied". I have tried setting the file attributes to normal, forcing a garbage collection, and ending the WinZip process manually.
I really do not understand why it does this only sometimes. I am an admin on my computer and it should be able to delete the files. I figured something else has to be using them, but nothing else should be touching them on my machine except the program running in Visual Studio. Thanks.
Below are the cleanup method where the deletes fail and the method that zips the files.
[MethodImplAttribute(MethodImplOptions.NoInlining)]
static void CleanUp(SqlConnection Connection, string jNumber, DirectoryInfo dir, bool error, string prefix)
{
    if (!error | (!error & emptyFolder))
    {
        try
        {
            SqlCommand updateJob = new SqlCommand(string.Format("update job set archived = 1 where job = {0}", jNumber), Connection);
            updateJob.ExecuteNonQuery();
        }
        catch
        {
            WriteToLog("SQL Error: " + jNumber, "There was an error changing the archive bit to 1 after the job has been archived");
        }
        try
        {
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        catch
        {
            WriteToLog("Error cleaning up after processing job", "There was an error garbage collecting.");
        }
        try
        {
            // path of the temporary folder created by the program
            string tempDir = Path.Combine(Path.Combine(System.Environment.CurrentDirectory, "Temp"), jNumber);
            // path of the destination folder
            string destDir = Path.Combine(dir.ToString(), jNumber);
            //SetFileAttributes(tempDir);
            try
            {
                File.Delete(tempDir + ".zip");
            }
            catch (System.IO.IOException)
            {
                File.Delete(tempDir + ".zip");
            }
            try
            {
                Directory.Delete(destDir, true);
            }
            catch (System.IO.IOException)
            {
                Directory.Delete(destDir, true);
            }
            try
            {
                Directory.Delete(tempDir, true);
            }
            catch (System.IO.IOException)
            {
                Directory.Delete(tempDir, true);
            }
        }
        catch
        {
            WriteToLog("File Error: " + jNumber, "There was an error removing files and/or folders after the job has been archived. Please check the source server and destination server to make sure everything copied correctly. The archive bit for this job was set.");
            Directory.Delete(Path.Combine(System.Environment.CurrentDirectory, "Temp"), true);
            Directory.CreateDirectory(Path.Combine(System.Environment.CurrentDirectory, "Temp"));
        }
    }
}
static bool ZipJobFolder(string jNumber, string jPath)
{
    try
    {
        string CommandStr = @"L:\ZipFiles\winzip32.exe";
        string parameters = "-min -a -r \"" + jNumber + "\" \"" + jPath + "\"";

        ProcessStartInfo starter = new ProcessStartInfo(CommandStr, parameters);
        starter.CreateNoWindow = true;
        starter.RedirectStandardOutput = false;
        starter.UseShellExecute = false;

        Process process = new Process();
        process.StartInfo = starter;
        Console.WriteLine("Creating .zip file");
        process.Start();
        process.WaitForExit();

        // kill any WinZip instances still hanging around
        // (GetProcessesByName expects the process name without the ".exe" extension)
        Process[] processes;
        string procName = "winzip32";
        processes = Process.GetProcessesByName(procName);
        foreach (Process proc in processes)
        {
            Console.WriteLine("Closing WinZip Process...");
            proc.Kill();
        }
    }
    catch
    {
        WriteToLog(jNumber, "There was an error zipping the files of this job");
        return false;
    }
    return true;
}
Upvotes: 3
Views: 1468
Reputation: 23268
Antivirus software could be an issue: if the antivirus is reading your file at that moment, it will cause exactly this. To be honest, I've seen this pop up many times when using the .NET Framework, and I just put the file operation in a loop and retry it until it no longer throws the exception. The same thing also happens if the file is still being copied, or if it is being registered in the kernel's buffers because some kind of file watcher is in place.
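Something along these lines is what I mean. This is only a rough sketch; DeleteDirectoryWithRetry, the attempt count, and the delay are made-up names and values for illustration:

using System;
using System.IO;
using System.Threading;

static void DeleteDirectoryWithRetry(string path, int maxAttempts, int delayMs)
{
    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        try
        {
            Directory.Delete(path, true);
            return; // deleted successfully
        }
        catch (IOException)
        {
            // "directory is not empty" / "file is in use" -- something (antivirus,
            // indexer, WinZip) probably still holds a handle; wait and try again
            if (attempt == maxAttempts) throw;
            Thread.Sleep(delayMs);
        }
        catch (UnauthorizedAccessException)
        {
            // "access denied" can be just as transient, for the same reason
            if (attempt == maxAttempts) throw;
            Thread.Sleep(delayMs);
        }
    }
}

Calling something like DeleteDirectoryWithRetry(tempDir, 10, 500) in place of the bare Directory.Delete(tempDir, true) would cover all three of the errors you listed.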
Upvotes: 0
Reputation: 12674
I have noticed this behavior using Windows Explorer while deleting large folders with a lot of files and sub-folders. But after waiting a bit and then deleting again, it appears to work fine.
Because of that, I have always assumed it was a flaky behavior of the operating system.
Although this is not a solution, you could try making your application sleep for a short time before attempting to delete those files, and see if the error still occurs.
If the errors go away, it would appear to be a timing issue. I would still want to know the source of the issue myself, though.
Commenters are pointing to an antivirus program. That would make sense; if it is true, then you need to write some code that checks whether the file is locked before trying to delete it. If it is locked, sleep for a bit, then check again until it is no longer locked and you can go ahead and delete it.
You just need to be careful not to end up in an infinite wait loop.
Edit: There is a related question about How to best wait for a filelock to release; check it out for ideas.
Edit 2: Here is another possible solution: Wait until file is unlocked in .Net
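A rough sketch of that check-then-delete idea (IsFileLocked, WaitForUnlockThenDelete, and the timeout are illustrative names and values, not taken from the linked posts):

using System.IO;
using System.Threading;

static bool IsFileLocked(string path)
{
    try
    {
        // try to open the file exclusively; if that succeeds, nobody else holds it
        using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
        {
            return false;
        }
    }
    catch (IOException)
    {
        return true; // sharing violation -- another process still has the file open
    }
}

static void WaitForUnlockThenDelete(string path, int maxWaitMs)
{
    int waited = 0;
    while (IsFileLocked(path) && waited < maxWaitMs)
    {
        Thread.Sleep(250); // still locked, give the other process a moment
        waited += 250;
    }
    File.Delete(path); // will still throw if the lock never cleared within maxWaitMs
}

The maxWaitMs cap is there so you never wait forever, which is the infinite-loop concern mentioned above.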
Upvotes: 1
Reputation: 1137
One thing I found that helps is not to rely on a single File.Move or File.Copy call to deal with temp files and directories. Rather, manually create the parent directories first, starting at the highest level and working downwards. Finally, when all parent directories exist, Move or Copy the file.
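For example, something like this (MoveIntoArchive is just an illustrative name):

using System.IO;

static void MoveIntoArchive(string source, string destination)
{
    // make sure every parent directory of the destination exists before the move;
    // Directory.CreateDirectory creates the whole chain of missing parents in one call
    Directory.CreateDirectory(Path.GetDirectoryName(destination));
    File.Move(source, destination); // the move itself then only has to touch the file
}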
Upvotes: 0
Reputation: 1026
Most likely you are getting a sharing violation: the delete can't get an exclusive handle on the file. One way this happens is that the antivirus software gets triggered and hasn't finished scanning before the call to delete. It could also be that the WinZip process isn't fully dead yet.
There are several ways to handle it:
1) On failure, sleep for a few seconds and try again.
2) I would probably not use WinZip and would instead use ZipStorer (http://zipstorer.codeplex.com/). It zips the files in-process, so you won't need the kill step and you will have much more granular control (see the sketch below). You could also do the zips in parallel by spinning up multiple threads.
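For option 2, ZipStorer usage is roughly like this. I'm quoting the API from memory, so check the project page for the exact signatures; the point is that everything happens in your own process, so no external process is left holding the .zip open:

using System.IO;

static void ZipJobFolderInProcess(string zipPath, string jPath)
{
    ZipStorer zip = ZipStorer.Create(zipPath, string.Empty);
    foreach (string file in Directory.GetFiles(jPath, "*", SearchOption.AllDirectories))
    {
        // store each file under its path relative to the job folder
        string nameInZip = file.Substring(jPath.Length).TrimStart(Path.DirectorySeparatorChar);
        zip.AddFile(ZipStorer.Compression.Deflate, file, nameInZip, string.Empty);
    }
    zip.Close(); // once this returns, nothing holds a handle on the archive
}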
Upvotes: 1