basht0p

Reputation: 1

Script runs hundreds of times faster in ISE than in the shell. Why, and how do I fix it?

Some colleagues and I have been trying to figure out exactly why this excerpt of this script runs so much faster in ISE than in the shell.

For context, the entire script (which compares AD hashes to a list of known compromised hashes) will run in ISE in about 30 minutes with the expected results. However, when invoked remotely or run locally from the shell, it takes up to 10 days in some cases.

We've found that this little bit of code in a function is where things go wonky. I'm not 100% certain, but I believe it may be caused by the use of System.IO.StreamReader, specifically calling the ReadLine() method, though I'm really not sure.

$fsHashDictionary = New-Object System.IO.FileStream $HashDictionary,'Open','Read','Read'
$frHashDictionary = New-Object System.IO.StreamReader($fsHashDictionary)

while (($lineHashDictionary = $frHashDictionary.ReadLine()) -ne $null) {
    if ($htADNTHashes.ContainsKey($lineHashDictionary.Split(":")[0].ToUpper()))
    {
        $foFoundObject = [PSCustomObject]@{
            User      = $htADNTHashes[$lineHashDictionary.Split(":")[0].ToUpper()]
            Frequency = $lineHashDictionary.Split(":")[1]
            Hash      = $lineHashDictionary.Split(":")[0].ToUpper()
        }
        $mrMatchedResults += $foFoundObject
    }
}

Upvotes: 0

Views: 1299

Answers (2)

iRon

Reputation: 23763

AFAIK, there isn't anything that can explain a script running "hundreds of times faster in ISE than in the shell", so I suspect that memory differences between the one session and the other are causing your script to run into performance issues.
Custom PowerShell objects are pretty heavy. To give you an idea of how much memory they consume, try something like this:

$memBefore = (Get-Process -Id $pid).WS
$foFoundObject = [PSCustomObject]@{
    User      = $htADNTHashes[$lineHashDictionary.Split(":")[0].ToUpper()]
    Frequency = $lineHashDictionary.Split(":")[1]
    Hash      = $lineHashDictionary.Split(":")[0].ToUpper()
}
$memAfter = (Get-Process -Id $pid).WS
$memAfter - $memBefore

Together with the fact that arrays (such as $mrMatchedResults) are immutable, causing the array to be rebuilt every time you use the increase assignment operator (+=), the PowerShell session might be running out of physical memory, causing Windows to constantly swap memory pages.
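You can measure that rebuild cost yourself. A minimal, self-contained comparison (the counts are arbitrary; absolute timings will vary per machine, but += should be dramatically slower):

```powershell
# += rebuilds (copies) the whole array on every append: O(n^2) overall.
(Measure-Command {
    $a = @()
    foreach ($i in 1..20000) { $a += $i }
}).TotalMilliseconds

# A generic List appends in place: O(n) overall.
(Measure-Command {
    $l = [System.Collections.Generic.List[int]]::new()
    foreach ($i in 1..20000) { $l.Add($i) }
}).TotalMilliseconds
```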

.Net methods like [System.IO.StreamReader] are definitely a lot faster than PowerShell cmdlets (e.g. Get-Content), but that doesn't mean you have to put everything into memory. Meaning, instead of collecting the results in memory, stream each object to the next cmdlet.

Especially for your main object, try to respect the PowerShell pipeline. As recommended in Why should I avoid using the increase assignment operator (+=) to create a collection?, you'd better not assign the output at all but pass the pipeline output directly to the next cmdlet (and eventually release it to its destination, e.g. display, AD, or disk) to free up memory.
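Applied to the loop in your question, that means letting the loop emit the objects and capturing them once at the end, rather than appending with +=. A sketch using the same variable names as your question (note that [System.IO.File]::ReadLines streams the file lazily, one line at a time):

```powershell
$mrMatchedResults = foreach ($lineHashDictionary in [System.IO.File]::ReadLines($HashDictionary)) {
    # Split once per line instead of three times.
    $hash, $frequency = $lineHashDictionary.Split(':')
    $hash = $hash.ToUpper()
    if ($htADNTHashes.ContainsKey($hash)) {
        # Emitting the object (no assignment, no +=) sends it down the pipeline.
        [PSCustomObject]@{
            User      = $htADNTHashes[$hash]
            Frequency = $frequency
            Hash      = $hash
        }
    }
}
```

Better yet, if the results are ultimately written to disk or displayed, pipe them straight to Export-Csv or Out-GridView instead of assigning to $mrMatchedResults at all.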

And if you do use .Net classes (such as the StreamReader class), make sure that you dispose of the object, as shown in the PowerShell scripting performance considerations article; otherwise your function might leak even more memory than necessary.
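Windows PowerShell has no C#-style using statement, so the usual pattern is try/finally (a sketch around a hypothetical line-processing loop):

```powershell
$frHashDictionary = [System.IO.StreamReader]::new($HashDictionary)
try {
    while ($null -ne ($line = $frHashDictionary.ReadLine())) {
        # ... process $line ...
    }
}
finally {
    # Always runs, even if the loop throws; releases the file handle.
    $frHashDictionary.Dispose()
}
```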

The performance of a complete (PowerShell) solution is supposed to be better than the sum of its parts. Meaning, don't focus too much on a single function when it concerns performance issues; instead, look at your whole solution. The PowerShell pipeline gives you the opportunity to e.g. load objects from AD and process them almost simultaneously, using just a little more memory than each single object requires.

Upvotes: 1

alexzelaya

Reputation: 255

It's probably because ISE uses the WPF framework and benefits from hardware acceleration, while a PowerShell console does not.

Upvotes: 1
