Reputation: 73
I'm using the PowerShell script below to move files from $Source to $Destination based on a date range. During testing, it moves all of the files within any directory that was updated within the date range, but I only need the updated or new files within the directory/subdirectory. I like how it's working, except that it copies everything.
Any thoughts on how the script can be tweaked?
$source = "C:\Documents and Settings"
$destination = "C:\Documents and Settings\Test"
[datetime]$start = '2/18/2021 11:00:00'
[datetime]$end = '2/18/2021 11:15:00'
#Prod
dir $source | %{ dir $_.FullName | ?{ $_.LastWriteTime -gt $start -and $_.LastWriteTime -lt $end } |
Copy-Item -Destination $destination -Recurse -Force }
Upvotes: 2
Views: 10797
Reputation: 4694
In PowerShell, filter as far left as possible for faster results.
$source = "C:\Documents and Settings"
$destination = "C:\Documents and Settings\Test"
[datetime]$start = '2/18/2021 11:00:00'
[datetime]$end = '2/18/2021 11:15:00'
Get-ChildItem -Path $source -Recurse | Where-Object { $_.LastWriteTime -gt $start -and $_.LastWriteTime -lt $end } | ForEach-Object -Process { Copy-Item -Path $_.FullName -Destination $destination -Force }
You're doing redundant work when you pipe another dir down the line.
Upvotes: 1
Reputation: 4301
I think what you probably need to do, if you're staying in PowerShell and not using something like Robocopy to sync the folders, is to push the path recursion up a level and then check one file at a time:
- Get-ChildItem -Recurse on the source path
- ForEach-Object on the result
- Test-Path on the destination (doing a Replace of $source to $destination in the path)
- Copy-Item from $source to $destination where the file doesn't exist or the LastWriteTime property is newer at the source path

Upvotes: 2