Reputation: 63
Run a job for each server in a list. I only want 5 jobs running at a time. When a job completes, it should start a new job on the next server in the list. Here's what I have so far, but I can't get it to start a new job after the first 5 jobs have run:
$MaxJobs = 5
$list = Get-Content ".\list.csv"
$Queue = New-Object System.Collections.Queue
$CurrentJobQueue = Get-Job -State Running
$JobQueueCount = $CurrentJobQueue.count
ForEach($Item in $list)
{
    Write-Host "Adding $Item to queue"
    $Queue.Enqueue($Item)
}

Function Global:Remote-Install
{
    $Server = $queue.Dequeue()
    $j = Start-Job -Name $Server -ScriptBlock{
        If($JobQueueCount -gt 0)
        {
            Test-Connection $Server -Count 15
        }##EndIf
    }##EndScriptBlock
}

For($i = 0; $i -lt $MaxJobs; $i++)
{
    Remote-Install
}
Upvotes: 3
Views: 10345
Reputation: 201832
PowerShell will do this for you if you use Invoke-Command, e.g.:
Invoke-Command -ComputerName $serverArray -ScriptBlock { .. script here ..} -ThrottleLimit 5 -AsJob
BTW I don't think your use of a .NET Queue is going to work because Start-Job fires up another PowerShell process to execute the job.
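For reference, here is a minimal end-to-end sketch of that pattern (an illustration, not a definitive script), assuming list.csv holds one computer name per line and that PowerShell remoting is enabled on the targets; the script block body is only a placeholder for the real work:
# Assumed: list.csv holds one computer name per line
$serverArray = Get-Content ".\list.csv"
# The script block runs on each remote computer; -ThrottleLimit keeps at most
# 5 of them running at once, and -AsJob returns a single parent job to track
$job = Invoke-Command -ComputerName $serverArray -ThrottleLimit 5 -AsJob -ScriptBlock {
    # placeholder work: report the name of the server the block ran on
    $env:COMPUTERNAME
}
Wait-Job $job | Out-Null   # block until every child job has finished
Receive-Job $job           # collect the combined output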
Upvotes: 4
Reputation: 42063
You may take a look at the Split-Pipeline cmdlet of the SplitPipeline module.
The code will look like:
Import-Module SplitPipeline
$MaxJobs = 5
$list = Get-Content ".\list.csv"
$list | Split-Pipeline -Count $MaxJobs -Load 1,1 {process{
# process an item from $list represented by $_
...
}}
-Count $MaxJobs limits the number of parallel jobs. -Load 1,1 tells it to pipe exactly 1 item to each job. The advantage of this approach is that the code itself is invoked synchronously and it outputs results from the jobs as if everything had been invoked sequentially (the output order can even be preserved with the Order switch).
But this approach does not use remoting: the code works in the current PowerShell session, in several parallel runspaces.
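A filled-in sketch of the same call, assuming list.csv holds one server name per line and the per-item work is the Test-Connection check from the question:
Import-Module SplitPipeline
$MaxJobs = 5
$list = Get-Content ".\list.csv"
# each of the 5 parallel runspaces receives one server name at a time in $_
$list | Split-Pipeline -Count $MaxJobs -Load 1,1 {process{
    Test-Connection -ComputerName $_ -Count 15
}}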
Upvotes: 3