Reputation: 1901
I have a script which is not working:
$files = Get-ChildItem "C:\test\"
for ($i=0; $i -lt $files.Count; $i++) {
    $outfile = $files[$i].FullName + "out"
    Get-Content $files[$i].FullName | ? {$_.Trim() -ne "" } | Set-Content $files[$i].FullName
}
The error is:
Set-Content : The process cannot access the file 'C:\test\ADMINISTRATOR.txt' because it is being use
At D:\skriptablank.ps1:4 char:63
+ Get-Content $files[$i].FullName | ? {$_.Trim() -ne "" } | Set-Content $files ...
+                                                           ~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Set-Content], IOException
    + FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.SetContentCommand
What do you think, where am I going wrong?
Upvotes: 0
Views: 1830
Reputation: 200193
You're using Get-Content and Set-Content on the same file in a pipeline:
Get-Content $files[$i].FullName | ... | Set-Content $files[$i].FullName
When you do this, the file is usually still being read from when Set-Content starts writing to it, thus causing the error you observed. You need to either finish reading before you start writing:
(Get-Content $files[$i].FullName) | ... | Set-Content $files[$i].FullName
or write to a temporary file first and then replace the original file with the temp file:
Get-Content $files[$i].FullName | ... | Set-Content "$env:TEMP\foo.txt"
Move-Item "$env:TEMP\foo.txt" $files[$i].FullName -Force
For small files you normally want to use the first approach, because reading the entire file into memory instead of reading it line-by-line is faster and easier to handle.
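As a sketch, the loop from the question could be rewritten with the first approach (the directory path and the blank-line filter are taken from the question; switching from the indexed for loop to foreach is just a stylistic choice):
# Read each file completely into memory first (note the parentheses),
# then overwrite it with only its non-blank lines.
$files = Get-ChildItem "C:\test\"
foreach ($file in $files) {
    (Get-Content $file.FullName) |
        Where-Object { $_.Trim() -ne "" } |
        Set-Content $file.FullName
}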
For large files you normally want to use the second approach in order to avoid memory exhaustion. Make sure to create the temp file on the same filesystem as the file you want to replace, so you don't have to copy the entire data again when moving the temp file (move operations within a filesystem/volume just need to change the reference to the file, which is significantly faster than shifting the file's content).
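Here is a sketch of the second approach with the temp file kept in the same directory as the original, so the final move stays on the same volume (the .tmp suffix is an arbitrary choice for illustration):
# Stream each file line by line into a temp file next to it,
# then replace the original with the temp file.
foreach ($file in Get-ChildItem "C:\test\") {
    $temp = $file.FullName + ".tmp"   # temp file on the same filesystem; suffix is just illustrative
    Get-Content $file.FullName |
        Where-Object { $_.Trim() -ne "" } |
        Set-Content $temp
    Move-Item $temp $file.FullName -Force
}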
Upvotes: 2