Reputation: 24758
I have this:
#!/usr/bin/env python
import multiprocessing


class MultiprocessingTest(object):
    def __init__(self):
        self.cmd = ''

    def for_process_A(self):
        self.cmd = "AA"
        print "%s executing and cmd is %s" % (multiprocessing.current_process().name, self.cmd)

    def for_process_B(self):
        self.cmd = "BB"
        print "%s executing and cmd is %s" % (multiprocessing.current_process().name, self.cmd)


if __name__ == '__main__':
    obj = MultiprocessingTest()
    process_A = multiprocessing.Process(target=obj.for_process_A, name='process_A')
    process_B = multiprocessing.Process(target=obj.for_process_B, name='process_B')
    process_A.start()
    process_B.start()
    process_A.join()
    process_B.join()
Questions:
1. Do the two processes share the variable cmd?
2. Do both processes have a separate copy of the MultiprocessingTest class definition and work off of that?
3. Which data exists as independent copies in the two processes?
I am trying to understand, from a theoretical standpoint, what is actually happening here. Can you please comment on that?
Test run output:
$ ./commonvar.py
process_A executing and cmd is AA
process_B executing and cmd is BB
Upvotes: 0
Views: 137
Reputation: 199
Processes don't share data. Generally speaking, each process is a separate container with its own resources: its own address space, its own copy of the program's data, and its own stack.
Processes interact with the outside world through pipes.
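For example, here is a minimal sketch (my own illustration, not part of the original answer) of a child process sending a value back to the parent over a multiprocessing.Pipe:

import multiprocessing

def worker(conn):
    # Runs in the child process; sends its result back through the pipe
    conn.send("AA")
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = multiprocessing.Pipe()
    p = multiprocessing.Process(target=worker, args=(child_conn,))
    p.start()
    print "parent received: %s" % parent_conn.recv()  # prints AA
    p.join()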
So, to answer your questions: the two processes do not share cmd. Each process gets its own copy of the MultiprocessingTest object, and therefore its own independent copy of the cmd variable, and works off of that.
Further Explanation:
Behind the scenes, the fork system call is used to create a process (assuming you are using *nix). Processes are heavier than threads because of the overhead involved in switching the context.
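One way to see this (a small sketch adapted from the question's code, not part of the original answer) is to check obj.cmd in the parent after the child has finished; it is still the empty string:

#!/usr/bin/env python
import multiprocessing

class MultiprocessingTest(object):
    def __init__(self):
        self.cmd = ''

    def for_process_A(self):
        self.cmd = "AA"  # modifies only the child's copy of the object

if __name__ == '__main__':
    obj = MultiprocessingTest()
    p = multiprocessing.Process(target=obj.for_process_A, name='process_A')
    p.start()
    p.join()
    # The parent's copy was never touched, so this prints ''
    print "parent sees cmd as %r" % obj.cmd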
Upvotes: 1
Reputation: 309919
Changes which happen inside a multiprocessing Process won't propagate back to the calling process (or to any of the other multiprocessing processes). If you want that sort of "shared-memory-like" behavior, you'll need to look into using a multiprocessing.Manager.
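For example, a minimal sketch (my own illustration, assuming the shared behaviour the question is after) that keeps cmd in a manager-backed dict, so an update made in the child is visible to the parent:

import multiprocessing

def set_cmd(shared, value):
    # The write goes through the manager process, so it is visible to everyone
    shared['cmd'] = value
    print "%s executing and cmd is %s" % (multiprocessing.current_process().name, shared['cmd'])

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    shared = manager.dict({'cmd': ''})
    p = multiprocessing.Process(target=set_cmd, args=(shared, "AA"), name='process_A')
    p.start()
    p.join()
    print "parent sees cmd as %s" % shared['cmd']  # prints AA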
Upvotes: 0