Reputation: 315
I'm using PowerShell for some file manipulation and performing the following steps:

1. Copy-Item the files matching an item number to a processed directory
2. Test-Path to verify each file arrived at the destination
3. Remove-Item to delete the originals

This process works 99.9% of the time, but occasionally the Remove-Item step fails with an error telling me that the original file is already in use / locked. I can only assume that the process using it is the Copy-Item / Test-Path from steps 1 or 2.
Do I need to instruct PowerShell to wait until Copy-Item / Test-Path have finished before proceeding with Remove-Item? Or is it more likely that another process, such as on-access AV scanning, is causing the problem? If so, is there an easy way to retry when a file lock is in place?
Here is part of the code that performs the steps mentioned:
if ($copySuccess -eq "TRUE") {
    $wiFiles = Get-ChildItem $xmlDir -Filter $itemNum*.*
    $wiFiles | ForEach-Object {
        if ($copySuccess -eq "TRUE") {
            Copy-Item -LiteralPath $xmlDir\$_ -Destination $processedDir
            if (!(Test-Path -LiteralPath $processedDir\$_)) {
                $copySuccess = "FALSE"
            }
        }
    }
}
if ($copySuccess -eq "TRUE") {
    Get-ChildItem $xmlDir -Filter $itemNum*.* | Remove-Item
}
The files being moved are small text or image files (<1 MB), typically fewer than 10 files per itemNum.
Thanks, Rob.
Upvotes: 3
Views: 9015
Reputation: 628
It is very unlikely that your cmdlets are causing the problem; cmdlets in a script run synchronously, so Copy-Item has completed before Remove-Item starts. If you want to make sure each cmdlet finishes before the next one starts, wrap each command in
$job1 = Start-Job { [Powershell code] }
Wait-Job $job1
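As a concrete sketch (the path and destination here are placeholders, not taken from your script):

```powershell
# Run the copy in a background job, then block until it completes
$job1 = Start-Job { Copy-Item -LiteralPath 'C:\xml\item1.xml' -Destination 'C:\processed' }
Wait-Job $job1 | Out-Null
Receive-Job $job1    # surfaces any errors the job raised
Remove-Job $job1
```

Again, this is normally unnecessary for your case, since the cmdlets already run one after another.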
For the delete command you could try something along the lines of
while (Test-Path $yourFileDir) {
    try {
        Remove-Item [your item] -ErrorAction Stop
    } catch {
        Write-Verbose "File locked, trying again in 5 seconds"
        Start-Sleep -Seconds 5
    }
}
Of course, you can tidy this up to fit your needs, adding an appropriate timeout, etc.
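For example, a version with a bounded retry count could look like this (a sketch only; `$xmlDir` and `$itemNum` are the variables from your question, and the retry limit and delay are arbitrary):

```powershell
$maxAttempts = 5

foreach ($file in Get-ChildItem $xmlDir -Filter "$itemNum*.*") {
    $attempt = 0
    while ($attempt -lt $maxAttempts) {
        try {
            Remove-Item -LiteralPath $file.FullName -ErrorAction Stop
            break                                   # deleted successfully, move on
        } catch {
            $attempt++
            Write-Verbose "Attempt $attempt failed for $($file.Name), retrying in 5 seconds"
            Start-Sleep -Seconds 5
        }
    }
    if ($attempt -eq $maxAttempts) {
        Write-Warning "Giving up on $($file.Name) after $maxAttempts attempts"
    }
}
```

The `break` exits the retry loop as soon as the delete succeeds, so locked files are the only ones that incur the 5-second waits.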
Upvotes: 3