Kruczkowski Piotr

Reputation: 171

Python: How to call method in separate process

I want to start the ActorCore method in a separate process and then process the messages that arrive at that ActorCore. For some reason this code is not working.

import queue
from multiprocessing import Process


class NotMessage(Exception):
    def __str__(self):
        return 'NotMessage exception'


class Message(object):

    def Do(self, Actor):
        # Do some stuff to the actor
        pass

    def __str__(self):
        return 'Generic message'


class StopMessage(Message):

    def Do(self, Actor):
        Actor.__stopped = True

    def __str__(self):
        return 'Stop message'


class Actor(object):
    __DebugName = ''
    __MsgQ = None
    __stopped = False

    def __init__(self, Name):
        self.__DebugName = Name
        self.__MsgQ = queue.Queue()

    def LaunchActor(self):
        p = Process(target=self.ActorCore)
        p.start()
        return self.__MsgQ

    def ActorCore(self):
        while not self.__stopped:
            Msg = self.__MsgQ.get(block=True)
            try:
                Msg.Do(self)
                print(Msg)
            except NotMessage as e:
                print(str(e), ' occurred in ', self.__DebugName)


def main():
    joe = Actor('Joe')
    msg = Message()
    stop = StopMessage()
    qToJoe = joe.LaunchActor()
    qToJoe.put(msg)
    qToJoe.put(msg)
    qToJoe.put(stop)

if __name__ == '__main__':
    main()

I am getting a weird error when running it:

Traceback (most recent call last):
  File "C:/Users/plkruczp/PycharmProjects/ActorFramework/Actor/Actor.py", line 64, in <module>
    main()
  File "C:/Users/plkruczp/PycharmProjects/ActorFramework/Actor/Actor.py", line 58, in main
    qToJoe = joe.LaunchActor()
  File "C:/Users/plkruczp/PycharmProjects/ActorFramework/Actor/Actor.py", line 40, in LaunchActor
    p.start()
  File "C:\Program Files\Python35\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Program Files\Python35\lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Program Files\Python35\lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Program Files\Python35\lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Program Files\Python35\lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: can't pickle _thread.lock objects

Help please! I tried everything :(

Upvotes: 4

Views: 3916

Answers (1)

bluesceada

Reputation: 131

Just use multiprocessing.Queue instead of queue.Queue:

Remove import queue and add Queue to the multiprocessing import line:

from multiprocessing import Process, Queue

Then change self.__MsgQ = queue.Queue() to self.__MsgQ = Queue()

That's all you need to change to make it work; the rest of your code stays the same.
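Put together, a minimal sketch of the corrected program could look like the following. Besides the Queue swap, it also sidesteps a second pitfall in the original code: Actor.__stopped inside StopMessage is name-mangled to _StopMessage__stopped, so ActorCore would never see the flag change; a single underscore avoids that.

```python
from multiprocessing import Process, Queue


class Message(object):
    def Do(self, Actor):
        pass  # do some stuff to the actor

    def __str__(self):
        return 'Generic message'


class StopMessage(Message):
    def Do(self, Actor):
        # Single underscore: double underscores would be name-mangled
        # per enclosing class and the flag would never reach ActorCore.
        Actor._stopped = True

    def __str__(self):
        return 'Stop message'


class Actor(object):
    def __init__(self, Name):
        self._DebugName = Name
        self._MsgQ = Queue()  # multiprocessing.Queue is process-safe and picklable

    def LaunchActor(self):
        p = Process(target=self.ActorCore)
        p.start()
        return p, self._MsgQ

    def ActorCore(self):
        self._stopped = False
        while not self._stopped:
            Msg = self._MsgQ.get(block=True)
            Msg.Do(self)


if __name__ == '__main__':
    joe = Actor('Joe')
    p, qToJoe = joe.LaunchActor()
    qToJoe.put(Message())
    qToJoe.put(StopMessage())
    p.join(timeout=5)
    print('actor exited with code', p.exitcode)
```

Returning the Process handle from LaunchActor (an addition to the original API) lets the parent join the child and confirm it stopped cleanly.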

Edit, explanation:

queue.Queue is only thread-safe: it synchronizes access with internal locks and assumes shared memory, so it cannot be handed to another process. That is exactly what the traceback is complaining about: when multiprocessing spawns the child process (as it does on Windows), it pickles the Actor object, and the _thread.lock objects inside queue.Queue cannot be pickled. multiprocessing.Queue is implemented to also be process-safe and knows how to cross the process boundary. As another option, if multithreading is enough, the threading library can be used together with queue.Queue: https://docs.python.org/dev/library/threading.html#module-threading
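For comparison, a rough thread-based sketch of the same pattern (hypothetical worker function; threads share memory, so queue.Queue's thread-safety is sufficient and nothing is pickled):

```python
import queue
import threading


def actor_core(msg_q):
    # Runs in a thread; a None sentinel plays the role of StopMessage.
    while True:
        msg = msg_q.get(block=True)
        if msg is None:
            break
        print('processing', msg)


msg_q = queue.Queue()
t = threading.Thread(target=actor_core, args=(msg_q,))
t.start()
msg_q.put('Generic message')
msg_q.put(None)  # tell the worker to stop
t.join()
```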

Additional information:

Depending on your further requirements, another parallelization option is joblib, where the workers can be spawned either as processes or as threads: https://joblib.readthedocs.io/
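As a quick sketch of that choice (assuming joblib is installed; the prefer argument hints the backend toward threads instead of the default process-based backend):

```python
from joblib import Parallel, delayed


def square(x):
    return x * x


# Default backend ("loky") runs the calls in worker processes.
in_processes = Parallel(n_jobs=2)(delayed(square)(i) for i in range(4))

# prefer="threads" hints joblib to use threads instead.
in_threads = Parallel(n_jobs=2, prefer="threads")(delayed(square)(i) for i in range(4))

print(in_processes, in_threads)  # both: [0, 1, 4, 9]
```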

Upvotes: 3
