JahMyst

Reputation: 1686

Python - Fabric maximum 10 parallel SSH connections

I am using Fabric with the parallel decorator like so:

from fabric.api import parallel

@parallel(pool_size=100)
def mytask():
    # do work

I was hoping the program would open 100 distinct SSH connections and run the Fabric task on all of those servers in parallel.

However, monitoring the number of open SSH connections always gives me an average of 10. I am running on a powerful enough CentOS instance.

I am getting the number of concurrent outgoing SSH connections with:

sudo netstat -atp | grep "/python"  | grep 10. | grep ssh | wc -l

I tried increasing MaxSessions and MaxStartups in /etc/ssh/sshd_config, but I may have misunderstood those settings (I suspect they limit incoming SSH connections rather than outgoing ones).

Is there a system limit that I need to increase to be able to have more than 10 open SSH connections?

Related (no answers): python fabric parallel pool restrictions

Upvotes: 3

Views: 1165

Answers (1)

jsbueno

Reputation: 110516

The get_pool_size method on the fabric.tasks.Task class is a bit convoluted, trying to guess a pool size that is not too large. It returns an integer after picking values from the global config, the task's own config, the default passed in, and the number of hosts.

By my reading of it, it should return the minimum of the number of hosts and the value you configure in your @parallel decorator.
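
For illustration, here is a rough sketch of the logic that method appears to implement (simplified names; the real Fabric source differs in detail):

def get_pool_size(self, hosts, default):
    # Simplified sketch, not the actual Fabric implementation:
    # prefer the per-task pool_size, fall back to the default,
    # and never use more workers than there are hosts.
    configured = int(getattr(self, 'pool_size', None) or default or len(hosts))
    return min(configured, len(hosts))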

Maybe you could just "brute-force" patch that method prior to running the task. Python's unittest.mock.patch might do a prettier job of this (a sketch follows), but it is somewhat complex, and I have no idea how it would interact with the parallel decorator itself.
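
As a hedged sketch only (untested against the parallel decorator; assumes Python 3's unittest.mock and the mytask from the question):

from unittest import mock  # on Python 2: pip install mock, then `import mock`

import fabric.tasks
from fabric.api import execute

# Replace get_pool_size with a mock that returns 100 no matter what
# arguments Fabric calls it with, for the duration of the block.
with mock.patch.object(fabric.tasks.Task, 'get_pool_size', return_value=100):
    execute(mytask)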

So, just monkey patch get_pool_size to return 100 at the beginning of your file, and it should work:

import fabric.tasks

# Accept the extra arguments (hosts, default) that Fabric passes in
fabric.tasks.Task.get_pool_size = lambda self, *args, **kwargs: 100

...
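
If the patch takes effect, the netstat one-liner from the question should start reporting well over 10 concurrent connections while the task runs.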

Upvotes: 2
