nidHi

Reputation: 833

Running an ssh connection as a daemon

I am trying to monitor Gerrit events through an SSH command. The command is as follows.

ssh -p 29418 review.example.com gerrit stream-events

I need to monitor several such Gerrit instances running on different ports and perform further analysis on the events received from them. I want to do this from Python code. I considered running these SSH connections as multiple processes; for that I used the multiprocessing package and tried to use the daemon attribute to run each process as a daemon. Below is a snippet of my code.

import multiprocessing as mp

class MyProcess(mp.Process):
    def __init__(self, target, args):
        mp.Process.__init__(self, target=target, args=args)

while True:
    running = get_running_instances()  # get all the running Gerrit instances
    for instance in running:
        port_num = instance.port
        url = instance.ip
        proc = MyProcess(target=client_setup, args=(url,port_num,)) #client_setup(url,port_num) is a function that runs the command for the ssh connection to stream gerrit events
        proc.daemon = True
        proc.start()
        proc.join()

This did not work: after the SSH command for the first Gerrit instance is run, control never comes back to the loop above from client_setup; it stalls there, holding the connection open and waiting to capture any possible events. As a result, no other running Gerrit instance's events are captured.

Also, get_running_instances returns a different set of running Gerrit instances on each call, so I cannot create all the processes at once.

I also tried using the daemon package to achieve the same thing. Below is a snippet of that code.

import multiprocessing as mp
import daemon

class MyProcess(mp.Process):
    def __init__(self, target, args):
        mp.Process.__init__(self, target=target, args=args)

while True:
    running = get_running_instances()  # get all the running Gerrit instances
    for instance in running:
        port_num = instance.port
        url = instance.ip
        with daemon.DaemonContext():
            proc = MyProcess(target=client_setup, args=(url,port_num,)) #client_setup(url,port_num) is a function that runs the command for the ssh connection to stream gerrit events
            proc.start()
            proc.join()

I faced the same problem here. How do I go about this? I'm not sure what I'm doing wrong and need help.

Sorry for the lengthy explanation.

Upvotes: 0

Views: 462

Answers (2)

viraptor

Reputation: 34205

You're running:

proc.start()
proc.join()

The first one starts your new process. The second one waits for it to finish. Because you want to stream events, you want to leave that process running, not wait for it.

You can fix the main issue by simply removing proc.join() (or moving it to your cleanup code).
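For illustration, here is a minimal sketch of that loop without the blocking join(), reusing get_running_instances and client_setup from the question. The dictionary of process handles, the deduplication by (ip, port), and the poll interval are assumptions I've added so the loop doesn't start a second stream for an instance it already covers:

import multiprocessing as mp
import time

procs = {}  # (url, port) -> Process; assumed bookkeeping, not in the original

while True:
    for instance in get_running_instances():
        key = (instance.ip, instance.port)
        if key in procs and procs[key].is_alive():
            continue  # this instance is already being streamed
        proc = mp.Process(target=client_setup, args=key)
        proc.daemon = True
        proc.start()      # returns immediately; no join(), so the loop keeps going
        procs[key] = proc
    time.sleep(30)        # illustrative poll interval

On shutdown you can iterate over procs and join() (or terminate()) each child, which is the cleanup code mentioned above.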

Upvotes: 0

parity3

Reputation: 703

I think multiprocessing is designed primarily for spawning and communicating with other Python interpreters in a Pythonic way. You may need to use the lower-level subprocess module instead, with its Popen() constructor. You can avoid the annoying business of consuming the output streams in parallel by using a third-party library like plumbum, or even Twisted. I have less experience with the Python 3 asyncio subprocess support (under "Interprocess Communication and Networking" in the docs), but that may make things easier too, if you are running Python 3.
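For example, here is a minimal sketch of the subprocess approach, using the host and port from the question; stream_gerrit_events and handle_event are hypothetical names of mine:

import subprocess

def stream_gerrit_events(host, port):
    # Spawn ssh and yield one event line at a time.
    cmd = ["ssh", "-p", str(port), host, "gerrit", "stream-events"]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL, text=True)
    try:
        for line in proc.stdout:  # blocks until the next event arrives
            yield line.rstrip("\n")
    finally:
        proc.terminate()

# usage sketch:
# for event in stream_gerrit_events("review.example.com", 29418):
#     handle_event(event)

Note that reading the pipe still blocks, so you would run one such reader per thread or process (or switch to the asyncio equivalent) to cover several instances.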

Yet another option is to run an SSH client natively in Python, for which paramiko is commonly used. I don't think it has built-in async support for multiple connections, so you may have to plumb all of that yourself.
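A rough sketch of the paramiko route; the username and key-based authentication are assumptions, and this still covers only one connection per call:

import paramiko

def stream_with_paramiko(host, port, username):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, port=port, username=username)  # assumes key auth is set up
    stdin, stdout, stderr = client.exec_command("gerrit stream-events")
    try:
        for line in stdout:  # blocks; one JSON event per line
            yield line.rstrip("\n")
    finally:
        client.close()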

Upvotes: 1
