Reputation: 50
I have a question about running functions in parallel in Python.
I have tried using multiprocessing to reduce the time spent sending and receiving data from an API, but when I execute the code below, it tends to crash my IDE.
from multiprocessing import Process

def network_request_function(value):
    # This function sends requests using value.
    ...

for i in list:
    p1 = Process(target=network_request_function, args=(i,))
    p1.start()
Can you provide a way to fix my code? Or are there better alternatives?
Upvotes: 0
Views: 113
Reputation: 44013
You should specify what platform this is running on and what your IDE is. Also, if all network_request_function
is doing is making a network request and awaiting a reply that requires no CPU-intensive further processing, then this should be using multithreading instead of multiprocessing. A multithreading pool lets you limit the number of concurrent threads in case your input list is very long, and it also makes it simpler to get back the return value of network_request_function
if you are interested in it. You should also not use a name such as list
, which is the name of a built-in function or class, for a variable.
For example:
def network_request_function(value):
    # This function sends requests using value and returns the reply:
    return reply

if __name__ == '__main__': # Required if we switch to multiprocessing
    # To use multiprocessing:
    #from multiprocessing.pool import Pool as Executor
    # To use multithreading:
    from multiprocessing.pool import ThreadPool as Executor

    # inputs is our list of value arguments used with network_request_function:
    inputs = []  # This variable is set somewhere

    # May need to be a smaller number if we are using multiprocessing and
    # depending on the platform:
    MAX_POOL_SIZE = 200
    pool_size = min(len(inputs), MAX_POOL_SIZE)
    with Executor(pool_size) as pool:
        # Get list of all replies:
        replies = pool.map(network_request_function, inputs)
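One caveat: pool.map re-raises the first exception any worker raises, and the other replies are then lost. If some requests can fail, a small wrapper (safe_request is an illustrative name, not part of the code above) can catch errors per item so the rest of the batch still completes:

def safe_request(value):
    # Catch failures so one bad request does not abort the whole map:
    try:
        return network_request_function(value)
    except Exception as exc:
        return exc  # Afterwards, check isinstance(reply, Exception)

with Executor(pool_size) as pool:
    replies = pool.map(safe_request, inputs)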
Upvotes: 1