Jin Yong

Reputation: 43778

How to speed up Get-ChildItem in PowerShell

Just wondering how I could speed up Get-ChildItem in PowerShell.

I have the following script to search for files created today and copy them over to another folder.

$fromDirectory = "test_file_*.txt"
$fromDirectory = "c:\sour\"
$toDirectory = "c:\test\"

Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} | Copy-Item -Destination $toDirectory

Because the folder I search has 124,553 history files, the search takes ages. Does anyone know how I could improve my script to speed up the search and copy?

Upvotes: 1

Views: 2197

Answers (3)

Neil

Reputation: 1

I would try putting the output of Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} into an array and then copying the results from the array.

# Collect the matching files first, then copy them in one call
$GrabFiles = @( Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} )
Copy-Item -Path $GrabFiles.FullName -Destination $toDirectory

Upvotes: 0

vonPryz

Reputation: 24071

This is a well-known characteristic of NTFS. Microsoft's docs say that performance starts to decrease at around 50 000 files in a single directory.

If the file names are very similar, creation of 8dot3 legacy names starts to slow things down at about 300 000 files. Though you have "only" 120 k files, it's the same order of magnitude.

Some previous questions discuss this issue. Sadly, there is no single good solution other than a better directory hierarchy. The usual tricks are to disable 8dot3 name creation with fsutil and last-access-time updates via the registry, but those will only help so much.
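
For reference, here is a minimal sketch of those tweaks run from an elevated PowerShell prompt. The C: drive letter is an assumption, and both changes affect the whole volume or machine, so review them before applying:

# Disable creation of 8dot3 short names on the C: volume (affects newly created files)
fsutil 8dot3name set C: 1

# Stop NTFS from updating the last-access timestamp on every read
fsutil behavior set disablelastaccess 1

# The same last-access setting, done through the registry instead
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' `
    -Name NtfsDisableLastAccessUpdate -Value 1

Disabling 8dot3 only helps for files created afterwards, and a reboot may be needed before the registry change takes effect.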

Can you redesign the directory structure? Moving old files into, say, year-quarter subdirs might keep the main directory clean enough. To find out a file's year and quarter, a quick way is like so:

# Print "<name> => <year>\Q<quarter>" for each file, based on its LastAccessTime
gci | % { 
  $("{2} => {1}\Q{0:00}" -f [Math]::ceiling( ($_.LastAccessTime.toString('MM'))/3),
    $_.LastAccessTime.ToString('yyyy'), 
    $_.Name 
  )
}
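
If that grouping looks right, a rough sketch of the actual move could be something like the following. The c:\sour root is an assumption taken from the question, and it keys off LastWriteTime to match the question's filter (swap in LastAccessTime if that is what you care about):

$archiveRoot = "c:\sour"

# Snapshot the file list first, since the loop modifies the folder
$files = Get-ChildItem $archiveRoot -File

foreach ($file in $files) {
    # Build a <year>\Q<quarter> subfolder name from the file's timestamp
    $quarter = [Math]::Ceiling($file.LastWriteTime.Month / 3)
    $dest    = Join-Path $archiveRoot ("{0}\Q{1:00}" -f $file.LastWriteTime.Year, $quarter)

    if (-not (Test-Path $dest)) {
        New-Item -ItemType Directory -Path $dest | Out-Null
    }
    Move-Item -LiteralPath $file.FullName -Destination $dest
}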

Upvotes: 1

adamt8

Reputation: 357

Here are some things to try:

First, use Measure-Command {} to get the actual performance:

Measure-Command { Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} | Copy-Item -Destination $toDirectory }

Then, consider removing the -Recurse flag, because it descends into every directory, every child, and every child of a child. If your target log files really are that scattered, then...

Try using robocopy to match a pattern in the file name and last write time, then use PowerShell to copy the files over. You could even let robocopy do the copying, as sketched below.
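
For instance, here is a rough sketch of the robocopy route, reusing the paths from the question. The /MAXAGE cutoff and /S recursion are my assumptions about the intent, so check robocopy /? before relying on them:

$fileName      = "test_file_*.txt"
$fromDirectory = "c:\sour"
$toDirectory   = "c:\test"

# /MAXAGE:<yyyymmdd> excludes files last written before that date,
# so today's date keeps only files modified today.
# /S recurses into non-empty subdirectories, like -Recurse did.
$today = (Get-Date).ToString('yyyyMMdd')
robocopy $fromDirectory $toDirectory $fileName /S "/MAXAGE:$today"

Robocopy also skips files that already exist unchanged in the destination, which helps if the job is re-run.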

It's possible that you just have a huge, slow problem to solve, but try these to see if you can break it down.

Upvotes: 2
