Reputation: 33
I have a script that recursively searches a network path and pulls the filenames. However, the first 5 folders need to be excluded from the search because they're deprecated and haven't been removed yet (I don't have access to remove the folders).
What I need to do is exclude some folders called - BOOT MEDIA, 1_Do_not_use, 1_lync, and a couple more. (The leading hyphen is part of the first folder's name.)
How can I recursively search the directory and exclude the correct folders from the search?
What I have so far:
param(
    [string]$gather = "PACKAGE"
)

function say($info) { Write-Host $info }

function searchDir($path = "<removed for security>")
{
    foreach ($item in Get-ChildItem $path)
    {
        $package = Test-Path $item.FullName -PathType Container
        if ($package)
        {
            $item
            searchDir $item.FullName
        }
        else
        {
            $item
        }
    }
}
Image of the directory: [image omitted]
Image to display the output given from the answer: [image omitted]
Upvotes: 0
Views: 556
Reputation: 73606
Excluding deprecated folders that may exist anywhere recursively, PS2.0+:
function searchDir($path = "<removed for security>")
{
    $deprecated = '- BOOT MEDIA', '1_Do_not_use', '1_lync'
    Get-ChildItem $path | ForEach {
        if ($_ -is [IO.FileInfo]) {
            $_
        } elseif ($deprecated -notcontains $_.Name) {
            $_
            searchDir $_.FullName
        }
    }
}
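A quick usage sketch (the UNC root path here is an illustrative assumption; in practice the function's default path is used when no argument is given):

searchDir '\\server\share' |
    Where-Object { $_ -is [IO.FileInfo] } |   # keep only files, drop directories
    Select-Object -ExpandProperty FullName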
And a much faster .NET 4+ version that outputs only the full file names, not objects. Directories have a trailing \ so they are easy to discern.
function searchDir($path = "<removed for security>")
{
    $deprecated = '- BOOT MEDIA', '1_Do_not_use', '1_lync', 'default'
    [IO.Directory]::EnumerateFiles($path)
    [IO.Directory]::EnumerateDirectories($path) | ForEach {
        # EnumerateDirectories yields plain path strings, which have no .Name
        # property, so extract the leaf name before comparing
        if ($deprecated -notcontains [IO.Path]::GetFileName($_)) {
            "$_\"
            searchDir $_
        }
    }
}
Another approach would be to enumerate everything with just one Get-ChildItem and filter the deprecated folders from the result afterwards, but since the network access is slow, I'm avoiding unnecessary enumeration in favor of the self-recursive function also used by the OP.
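For completeness, that single-enumeration alternative could look like the following sketch (the UNC root path is an illustrative assumption; note that -Recurse still walks the deprecated folders before they are filtered out, which is exactly the wasted network traffic being avoided above):

$deprecated = '- BOOT MEDIA', '1_Do_not_use', '1_lync'
# Build one alternation pattern from the escaped folder names
$pattern = ($deprecated | ForEach-Object { [regex]::Escape($_) }) -join '|'
# Enumerate the whole tree once, then drop the deprecated folders
# and anything inside them
Get-ChildItem '\\server\share' -Recurse |
    Where-Object { $_.FullName -notmatch "\\($pattern)(\\|$)" }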
Important note: the function is now pipeline-friendly, so its output can be piped and processed immediately. The OP's function, by contrast, blocked the caller until all files were fetched, because the foreach statement accumulates the entire array before enumerating it. With a slow network or a large directory tree, that can easily take minutes.
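For example, because results stream through the pipeline, downstream cmdlets can start working right away, and in PowerShell 3.0+ Select-Object -First even stops the upstream enumeration early (the root path here is again a hypothetical):

# Streams results; enumeration stops once ten items have been emitted
searchDir '\\server\share' | Select-Object -First 10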
Upvotes: 1