Jakub Turcovsky

Reputation: 2126

Python - How to pass global variable to multiprocessing.Process?

I need to terminate some processes after a while, so I've spawned another process that sleeps for the waiting period and then does the terminating. But the new process doesn't seem to have access to the global variables from the main process. How can I solve this, please?

Code:

from subprocess import Popen, PIPE
import time
import multiprocessing


log_file = open('stdout.log', 'a')
log_file.flush()

err_file = open('stderr.log', 'a')
err_file.flush()

processes = []


def processing():
    print "processing"
    global processes
    global log_file
    global err_file

    for i in range(0, 5):
        p = Popen(['java', '-jar', 'C:\\Users\\two\\Documents\\test.jar'], stdout=log_file, stderr=err_file) # something long running
        processes.append(p)
    print len(processes)    # returns 5

def waiting_service():
    name = multiprocessing.current_process().name
    print name, 'Starting'
    global processes
    print len(processes)    # returns 0

    time.sleep(2)

    for i in range(0, 5):
        processes[i].terminate()
    print name, 'Exiting'

if __name__ == '__main__':
    processing()

    service = multiprocessing.Process(name='waiting_service', target=waiting_service)

    service.start()

Upvotes: 0

Views: 4002

Answers (2)

Jakub Turcovsky

Reputation: 2126

The whole problem was in how Python starts new processes on Windows. Windows has no fork, so multiprocessing spawns a fresh interpreter that re-imports the module; anything assigned to globals after import (like the processes list filled in by processing()) never reaches the child. On Linux the child is forked and inherits a copy of the parent's memory, so the globals are visible. I've switched to Linux and my script works OK.

Special thanks to @rchang for his comment:

When I tested it, in both cases the print statement came up with 5. Perhaps we have a version mismatch in some way? I tested it with Python 2.7.6 on Linux kernel 3.13.0 (Mint distribution).
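
To make the platform difference concrete, here is a minimal sketch (not from the original thread; the names are illustrative). A forked child (Linux) prints 1, while a spawned child (Windows) re-imports the module and prints 0:

import multiprocessing

items = []  # module-level global

def child():
    # Linux: the child is forked after the append below, so it sees 1 item.
    # Windows: the child re-imports this module; the __main__ block does
    # not run in the child, so the list is still empty and it sees 0.
    print len(items)

if __name__ == '__main__':
    items.append('added by the parent after import')
    p = multiprocessing.Process(target=child)
    p.start()
    p.join()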

Upvotes: 1

Reut Sharabani

Reputation: 31339

You should be using synchronization primitives.

You probably want an Event that the main (parent) process sets after a while.

You may also want to wait for the processes to actually complete and join them (like you would a thread).

If you have many similar tasks, you can use a processing pool like multiprocessing.Pool (a minimal sketch follows the example below).

Here is a small example of how it's done:

import multiprocessing
import time

def work(_id, kill_event):
    # loop until the parent sets the shared event
    while not kill_event.is_set():
        print "%d is doing stuff" % _id
        time.sleep(1)
    print "%d quit" % _id

def spawn_processes():
    # The event is passed to each child explicitly. A module-level
    # global would not survive the trip on Windows, where children
    # are spawned (re-imported) rather than forked.
    kill_event = multiprocessing.Event()
    processes = []

    # spawn 10 processes
    for i in xrange(10):
        process = multiprocessing.Process(target=work, args=(i, kill_event))
        processes.append(process)
        process.start()
        time.sleep(1)

    # signal all workers to stop by setting the kill event
    kill_event.set()

    # wait for all processes to complete
    for process in processes:
        process.join()

    print "done!"

if __name__ == '__main__':
    spawn_processes()
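
If the workers all run the same kind of short task, the pool approach mentioned above is simpler still. A minimal sketch, assuming a hypothetical task function that just squares its input:

from multiprocessing import Pool
import time

def task(i):
    # placeholder for real work
    time.sleep(1)
    return i * i

if __name__ == '__main__':
    pool = Pool(processes=4)              # four worker processes
    results = pool.map(task, range(10))   # blocks until every task is done
    pool.close()
    pool.join()
    print results                         # [0, 1, 4, 9, 16, ...]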

Upvotes: 2
