Reputation: 31
Using PowerShell, I am creating jobs that run a command block on remote servers. The command scans folders for certain files and returns their contents via the output stream. So I poll the job every so often (say every 10s) and use Receive-Job to drain the output buffer. Even then, the memory consumed on the host machine continues to grow by megabytes (memory consumption on the clients stays low throughout). I don't see anything in the other job streams (error, warning, etc.). The total amount of data received via Receive-Job ends up being maybe 300 KB, yet memory consumption reaches around 400 MB. When I Remove-Job and force garbage collection, the memory is released.
The consumption is gradual, and the growth rate is fairly constant as the job runs (well, it jumps 3-6 MB every so often).
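For context, the polling loop looks roughly like this. This is a sketch of what the post describes, not the poster's actual code; the job name $vweb is from the post, while $allOutput is an assumed accumulator variable:

```powershell
# Poll the job every 10s and drain its output buffer (sketch).
$allOutput = @()
while ((Get-Job -Name $vweb).State -eq 'Running') {
    $allOutput += Receive-Job -Name $vweb   # drain whatever has arrived so far
    Start-Sleep -Seconds 10
}
$allOutput += Receive-Job -Name $vweb       # pick up any final output
Remove-Job -Name $vweb                      # only after this (plus a forced GC)
[System.GC]::Collect()                      # is the memory actually released
```

The job that this loop polls is created as follows: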
Invoke-Command -ComputerName $vweb -ScriptBlock {
    param($domainsRoot, $filter, $newFileKey, $sitesList)
    Get-ChildItem -Path $domainsRoot -ErrorAction SilentlyContinue |
        Where-Object { $sitesList.Contains($_.Name) } |
        ForEach-Object -Process {
            Get-ChildItem -Path (Join-Path (Join-Path $domainsRoot $_.Name) "wwwroot/v/*") -Filter $filter -Recurse -ErrorAction SilentlyContinue |
                ForEach-Object -Process {
                    Write-Output ($newFileKey + ($_.FullName.Replace($domainsRoot, '')))
                    Write-Output (Get-Content -Path $_.FullName)
                }
        }
} -ArgumentList $domainsRoot, $filter, $global:newFileKey, $sitesList -AsJob -JobName $vweb
The command block scans a folder ($domainsRoot) for target folders (named in the $sitesList array), then scans a subfolder of each match (wwwroot/v/*) for files matching a filter ($filter). The purpose is to scan IIS websites on multiple servers for *.asp files.
Is it something in my command block, or am I misunderstanding how jobs work?
PS version: 5.1.19041.1320; Windows 10 host/clients
Upvotes: 2
Views: 306
Reputation: 31
When you create a remote job, two jobs are created: a parent job on the host and a child job (also on the host) representing the remote connection. When I used Receive-Job on the parent, I expected it to clear all of the output streams (parent and child). It turned out that the child job still held a fully populated output collection in $childJob.Output.
I ended up calling Receive-Job on the child job directly, and then immediately clearing its output with $childJob.Output.Clear().
In my tests this didn't have any adverse effects, but I wouldn't completely trust this method for more critical tasks without better testing.
After I did this, the memory consumption problem was resolved.
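Inside the polling loop, the fix can be sketched like this. The $vweb job name is from the question; $results is an assumed accumulator. The ChildJobs collection and the Output property (a PSDataCollection, which supports Clear()) are standard members of PowerShell job objects:

```powershell
# Receive from the child job directly, then clear the copy it retains (sketch).
$job      = Get-Job -Name $vweb          # parent job created by -AsJob
$childJob = $job.ChildJobs[0]            # the per-computer child job
$results += Receive-Job -Job $childJob   # drain output written so far
$childJob.Output.Clear()                 # drop the data the child still holds
```

Clearing the collection directly is what releases the retained objects; Receive-Job on the parent alone did not.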
Upvotes: 1