Aaron Robinson

Reputation: 416

Python Multiprocessing Queue on Parent Exit

The gist of my question is what happens to a multiprocessing queue when the parent (a daemon in this circumstance) is killed.

I have a daemon that runs in the background which queues up jobs for child processes:

class manager(Daemon):
    def run(self):
        someQueue = MP.Queue()

        # Pass the queue to the child so it can pull jobs from it.
        # Note that args must be a tuple: (someArgs) is just someArgs,
        # while (someQueue, someArgs) is a proper two-element tuple.
        someChild = MP.Process(target=someCode, args=(someQueue, someArgs))
        someChild.start()
        ...

If the manager is killed (assuming it wasn't in the middle of using someQueue and thereby corrupting it, as mentioned in the documentation), is there any way to recover the data in the queue?

Two theoretical solutions I see are: have someChild drain someQueue before the child process exits, or periodically dump the queues to disk so their state can be restored after the manager exits. Before implementing either, though, it would be nice to get nudged in the right direction.
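For the second idea, a minimal sketch of dumping and restoring a `multiprocessing.Queue` might look like this. The helper names (`dump_queue`, `restore_queue`) are hypothetical, and this assumes no other process is reading the queue while it is drained and that all items are picklable:

```python
import multiprocessing as MP
import pickle
import queue  # for the queue.Empty exception


def dump_queue(q, path):
    """Drain everything currently in q and pickle it to path.

    Caveat: Queue.get_nowait() can raise Empty even while the
    feeder thread is still flushing recently-put items, so only
    call this once producers have stopped and had time to flush.
    """
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except queue.Empty:
            break
    with open(path, "wb") as f:
        pickle.dump(items, f)
    return items


def restore_queue(path):
    """Rebuild a fresh Queue from a previous dump."""
    q = MP.Queue()
    with open(path, "rb") as f:
        for item in pickle.load(f):
            q.put(item)
    return q
```

The manager could call `dump_queue` from a signal handler or `atexit` hook, and `restore_queue` on the next startup.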

Thanks,

Upvotes: 3

Views: 1001

Answers (1)

ColoradoEric

Reputation: 71

It sounds like you want persistent/reliable queuing. I believe the multiprocessing.Queue class is implemented with pipes (just like you would get with a popen() call), so the data is relatively transient and you'd probably have to do some OS-level trickery to grab the contents. You might look into writing your own persistent queue class that uses a filesystem file (assuming your OS and filesystem support locking) to store the queue contents. Then you can provide all of the analysis tools you desire to inspect the queue and recover unprocessed data.
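A bare-bones sketch of such a file-backed queue, assuming a POSIX filesystem with `flock()` support and items that serialize to single-line JSON (the `FileQueue` name is made up for illustration):

```python
import fcntl
import json


class FileQueue:
    """Minimal persistent queue backed by a line-delimited JSON file.

    A sketch, not production code: each put/get takes an exclusive
    flock() on the file, so the contents survive a parent crash and
    can be inspected with any text editor.
    """

    def __init__(self, path):
        self.path = path
        # Create the backing file if it doesn't exist yet.
        open(path, "a").close()

    def put(self, item):
        with open(self.path, "a") as f:
            fcntl.flock(f, fcntl.LOCK_EX)  # exclusive lock for writers
            f.write(json.dumps(item) + "\n")
            fcntl.flock(f, fcntl.LOCK_UN)

    def get(self):
        """Pop the oldest item, or raise IndexError if empty."""
        with open(self.path, "r+") as f:
            fcntl.flock(f, fcntl.LOCK_EX)
            lines = f.readlines()
            if not lines:
                fcntl.flock(f, fcntl.LOCK_UN)
                raise IndexError("queue is empty")
            head, rest = lines[0], lines[1:]
            # Rewrite the file without the popped item.
            f.seek(0)
            f.truncate()
            f.writelines(rest)
            fcntl.flock(f, fcntl.LOCK_UN)
            return json.loads(head)
```

Rewriting the whole file on every `get` is inefficient for large queues; a real implementation would keep a read offset or use something like SQLite instead.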

Upvotes: 1

Related Questions