Bite code

Reputation: 596703

How do I start and share one and only one separate process among several processes in Python?

I know how to use multiprocessing to start a separate process and perform concurrent data processing.

What I'd like to know now is: if I have several main processes (let's say several WSGI processes), how can I create only one separate process shared by all the main processes?

Each main process should be able to communicate with the separate process using a queue, but each is started separately, in a different Python VM.

E.g.: if one main process notices that the separate process has died, it can start it again.

Is that possible? And how?

If yes, I suppose it must involve using a PID file.

Upvotes: 2

Views: 325

Answers (1)

aychedee

Reputation: 25569

Yes, that is possible. You could use a pidfile or, if you are using Linux, you can use a socket as described here: https://stackoverflow.com/a/7758075/639295.
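
For illustration, here is a minimal sketch of the socket approach from that link: binding an abstract-namespace Unix socket (Linux only) fails if another process already holds the name, so every main process can try to claim it and only the winner starts the shared worker. The name shared_worker_lock and the place where you would spawn the worker are assumptions for the example, not something from the linked answer.

import socket

def claim_worker_lock():
    # Abstract namespace socket (leading NUL byte): nothing is created on
    # disk, and the kernel frees the name automatically if the holder dies.
    lock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        lock.bind('\0shared_worker_lock')
    except socket.error:
        # Another process already owns the name, so the worker is running.
        return None
    # We won the race: keep this socket open for the process lifetime and
    # spawn the shared worker process here.
    return lock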

You might also look at using something like Redis for the inter-process communication. It has a simple Python API that makes it easy to have multiple processes listening to one another.

Here is an example of both sides (methods taken from larger classes):

import logging
import redis


def redis_listener(self):
    r = redis.Redis(host='localhost', db=0)
    pubsub = r.pubsub()
    # Subscribe (by pattern) to one specific channel and block on messages.
    pubsub.psubscribe('a.channel')
    for message in pubsub.listen():
        logging.info('Received message: %s' % (message,))
        self.parse_message(message)


def redis_broadcaster():
    r = redis.Redis(host='localhost', db=0)
    pubsub = r.pubsub()
    # Subscribe to every channel matching the pattern 'a.*'.
    pubsub.psubscribe('a.*')
    for message in pubsub.listen():
        # Subscription confirmations carry no pattern; skip them.
        if message['pattern'] is None:
            continue
        # Channel names look like 'a.<symbol>'; route the payload to every
        # listener registered for that symbol (WATCHERS maps symbol -> listeners).
        symbol = message['channel'].split('.')[1]
        for listener in WATCHERS[symbol]:
            listener.write_message(unicode(message['data']))
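
Neither function above actually publishes anything; somewhere else a process still has to push messages onto the a.* channels. A minimal sketch of that publishing side, assuming a channel naming scheme like 'a.GOOG' where the symbol part is purely an illustration:

import redis

def redis_publisher(symbol, data):
    r = redis.Redis(host='localhost', db=0)
    # Publish on a channel matching the 'a.*' pattern used above, e.g.
    # 'a.GOOG'; every process subscribed to that pattern receives it.
    r.publish('a.%s' % symbol, data)

Called as redis_publisher('GOOG', 'price update'), the message reaches the broadcaster above, which then forwards it to the listeners registered under 'GOOG'.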

Upvotes: 1
