PointerNullException

Reputation: 199

Get-ChildItem Workaround

I want to determine which files/file types consume the most space on the file server. Because many users created files whose name/path length exceeds 260 characters, Get-ChildItem (gci) does not work (shortcuts-in-shortcuts-in-shortcuts, ad infinitum).

My next step was to build a workaround using something that lists file size, name, and path without the MAX_PATH limitation: ROBOCOPY.

Here is my function:

Function Get-RoboFileSize($source_name){
    # /L lists without copying; /E recurses; /BYTES prints exact sizes;
    # /FP prints full paths; /NJH /NJS /NDL /NC suppress the job header,
    # job summary, directory lines, and file-class column.
    $filelog = robocopy.exe /e /l /njh /njs /ndl /fp /nc /bytes $source_name $env:Temp

    foreach ($item in $filelog){
        # With the switches above, each file line has the form
        # "<spaces><size in bytes> <full path>". Capturing both groups
        # replaces the chain of -Replace calls, which stripped all
        # whitespace and so mangled file names containing spaces.
        if ($item -match '^\s*(\d+)\s+(.+?)\s*$'){
            New-Object PSObject -Property @{
                Filename = $Matches[2]
                FileSize = ("{0:N2}" -f ([double]$Matches[1] / 1MB))
            }
        }
    }
}
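To answer the original question (which file types consume the most space), the function's output can be grouped by extension and summed. A usage sketch, assuming a hypothetical share path:

    Get-RoboFileSize "\\fileserver\share" |
        Group-Object { [System.IO.Path]::GetExtension($_.Filename) } |
        Select-Object Name,
            @{ Name = 'TotalMB'; Expression = { ($_.Group | Measure-Object FileSize -Sum).Sum } } |
        Sort-Object TotalMB -Descending

The path "\\fileserver\share" is a placeholder; Measure-Object coerces the formatted FileSize strings back to numbers when summing.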

This approach works, but it consumes a lot of resources.

Does anyone have an idea how to improve this function, or maybe another workaround?

Upvotes: 1

Views: 1774

Answers (1)

Victor Zakharov

Reputation: 26444

Microsoft knows about path length limitations.

There is an article that provides a workaround in C#. If you really care about performance, this is your best bet: Long Paths in .NET, Part 1 of 3 [Kim Hamilton]

If you want to stick with PowerShell, see another workaround on the powershell.com forums.
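The C# approach in the linked article works by calling the Win32 FindFirstFileW/FindNextFileW functions with the \\?\ extended-length prefix, which lifts the 260-character MAX_PATH limit. A minimal sketch of that technique from PowerShell (Get-LongPathFileSize is a hypothetical name; it enumerates a single directory only, with recursion and error handling left out):

    Add-Type -TypeDefinition @"
    using System;
    using System.Runtime.InteropServices;

    public static class LongPath
    {
        [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
        public struct WIN32_FIND_DATAW
        {
            public uint dwFileAttributes;
            public System.Runtime.InteropServices.ComTypes.FILETIME ftCreationTime;
            public System.Runtime.InteropServices.ComTypes.FILETIME ftLastAccessTime;
            public System.Runtime.InteropServices.ComTypes.FILETIME ftLastWriteTime;
            public uint nFileSizeHigh;
            public uint nFileSizeLow;
            public uint dwReserved0;
            public uint dwReserved1;
            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 260)]
            public string cFileName;
            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 14)]
            public string cAlternateFileName;
        }

        [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
        public static extern IntPtr FindFirstFileW(string lpFileName, out WIN32_FIND_DATAW lpFindFileData);

        [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
        public static extern bool FindNextFileW(IntPtr hFindFile, out WIN32_FIND_DATAW lpFindFileData);

        [DllImport("kernel32.dll")]
        public static extern bool FindClose(IntPtr hFindFile);
    }
    "@

    Function Get-LongPathFileSize($dir){
        $data = New-Object LongPath+WIN32_FIND_DATAW
        # \\?\ prefix: local paths become \\?\C:\..., UNC paths \\?\UNC\server\share\...
        $handle = [LongPath]::FindFirstFileW("\\?\$dir\*", [ref]$data)
        if ($handle.ToInt64() -eq -1) { return }   # INVALID_HANDLE_VALUE
        do {
            if ($data.cFileName -notin '.','..') {
                New-Object PSObject -Property @{
                    Filename = "$dir\$($data.cFileName)"
                    # the 64-bit size is split across two 32-bit fields
                    FileSize = ([uint64]$data.nFileSizeHigh -shl 32) -bor $data.nFileSizeLow
                }
            }
        } while ([LongPath]::FindNextFileW($handle, [ref]$data))
        [void][LongPath]::FindClose($handle)
    }

This avoids spawning robocopy entirely, which is where most of the resource cost in the original function comes from.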

Upvotes: 3
