Jonline

Reputation: 1747

Multiprocessing errors in OS X with python2.7 on pre-El Capitan machines

The context for this is much, much too big for an SO question, so the code below is an extremely simplified demonstration of the actual implementation.

Generally, I've written an extensive module for academic contexts that launches a subprocess at runtime to be used for event scheduling. When a script or program using this module closes on pre-El Capitan machines, my efforts to join the child process fail, as do my last-ditch efforts to just kill the process; OS X gives a "Python unexpectedly quit" error and the orphaned process persists. I am very much a noob to multiprocessing, without a CS background; diagnosing this is beyond me.

If I am just too ignorant, I'm more than willing to go RTFM; specific directions welcome.

I'm pretty sure this example is coherent and representative, but know that the actual project works flawlessly on El Capitan, works during runtime on everything else, yet consistently crashes as described when quitting. I've tested it with absurd timeout values (30 sec+); always the same result.

One last note: I started this with Python's default multiprocessing library, then switched to billiard because a dev friend suggested it might run more smoothly. To date, I've not noticed any difference.

UPDATE: I had omitted the function that gives the @threaded decorator purpose; it is now present in the code.

Generally, we have:

import os
import time
from signal import SIGKILL

import billiard  # or multiprocessing; have used both
from billiard import Pipe

shared_queue = billiard.Queue()

class MainInstanceParent(object):
    def __init__(self):
        # ..typically init stuff..
        self.event_ob = EventClass(self)  # gets a reference to parent

    def quit(self):
        try:
            self.event_ob.send("kkbai")
            started = time.time()
            while time.time() - started < 1:  # or whatever
                self.event_ob.receive()
            if self.event_ob.event_p.is_alive():
                raise RuntimeError("Little bugger still kickin'")
        except RuntimeError:
            os.kill(self.event_ob.event_p.pid, SIGKILL)

class EventClass(object):
    def __init__(self, parent):
        # moar init stuff
        self.parent = parent
        self.pipe, child = Pipe()
        self.event_p = _event_process(child)

    def receive(self):
        self.pipe.poll()
        t = self.pipe.recv()
        if isinstance(t, Exception):
            raise t
        return t

    def send(self, deets):
        self.pipe.send(deets)

def threaded(func):
    def threaded_func(*args, **kwargs):
        p = billiard.Process(target=func, args=args, kwargs=kwargs)
        p.start()
        return p
    return threaded_func

@threaded
def _event_process(pipe):  # single leading underscore avoids name mangling inside EventClass
    while True:
        if pipe.poll():
            inc = pipe.recv()
            # do stuff conditionally on what comes through
            if inc == "kkbai":
                return
            if inc == "meets complex condition to pass here":
                shared_queue.put("stuff inferred from inc")

Upvotes: 0

Views: 140

Answers (1)

Roland Smith

Reputation: 43523

Before exiting the main program, call multiprocessing.active_children() to see how many child processes are still running. This will also join the processes that have already quit.
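For instance (worker here is just a placeholder, not something from your code):

import multiprocessing
import time

def worker():
    time.sleep(0.5)  # stand-in for a child that finishes on its own

procs = [multiprocessing.Process(target=worker) for _ in range(3)]
for p in procs:
    p.start()
time.sleep(1)
# joins the children that have already exited and returns those still running
print(multiprocessing.active_children())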

If you need to signal the children that it's time to quit, create a multiprocessing.Event before starting the child processes. Give it a meaningful name like children_exit. The child processes should regularly call children_exit.is_set() to see whether it is time for them to quit. In the main program you call children_exit.set() to signal the child processes.
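A minimal sketch of that pattern, assuming the fork start method that OS X with Python 2.7 uses (children_exit and worker are illustrative names, not part of your code):

import multiprocessing
import time

children_exit = multiprocessing.Event()  # create before starting any children

def worker():
    # child: poll the shared flag regularly and quit once it is set
    while not children_exit.is_set():
        time.sleep(0.1)  # stand-in for real work

p = multiprocessing.Process(target=worker)
p.start()
# ... main program does its thing ...
children_exit.set()  # main: tell the child it is time to quit
p.join()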

Update:

Have a good look through the Programming guidelines in the multiprocessing documentation:

  • It is best to provide the above-mentioned Event objects as an argument to the target of the Process initializer, for reasons mentioned in those guidelines (see the sketch after this list).

  • If your code also needs to run on ms-windows, you have to jump through some extra hoops, since that OS doesn't do fork().
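A sketch covering both points (again with illustrative names); passing the Event explicitly and guarding the start-up code lets it work without fork():

import multiprocessing
import time

def worker(children_exit):
    # the Event arrives as an argument, so this also works where fork() is unavailable
    while not children_exit.is_set():
        time.sleep(0.1)

if __name__ == '__main__':  # required on platforms that spawn instead of fork
    children_exit = multiprocessing.Event()
    p = multiprocessing.Process(target=worker, args=(children_exit,))
    p.start()
    children_exit.set()
    p.join()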

Update 2:

On your PyEval_SaveThread error: could you modify your question to show the complete traceback, or alternatively post it somewhere? Since multiprocessing uses threads internally, that is probably the culprit, unless you are also using threads somewhere yourself.

If you also use threads, note that GUI toolkits in general, and tkinter in particular, are not thread-safe. Tkinter calls should therefore only be made from one thread!
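One common way to respect that, sketched very roughly for Python 2.7: worker threads put results on a Queue and the Tk main loop polls it with after(), so only the main thread ever touches the widgets.

import Queue          # queue on Python 3
import threading
import Tkinter as tk  # tkinter on Python 3

results = Queue.Queue()

def worker():
    results.put("done")  # never touch Tk objects from this thread

def poll_queue():
    try:
        label.config(text=results.get_nowait())  # Tk call, main thread only
    except Queue.Empty:
        pass
    root.after(100, poll_queue)  # check again in 100 ms

root = tk.Tk()
label = tk.Label(root, text="waiting...")
label.pack()
threading.Thread(target=worker).start()
root.after(100, poll_queue)
root.mainloop()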

How much work would it be to port your code to Python 3? If it is a bug in Python 2.7, it might already be fixed in the current (as of now) Python 3.5.1.

Upvotes: 1
