Reputation: 371
I'm using the concurrent.futures library to do a for-loop with multithreading. Each call needs to receive all 5 parameters. I have now reached the point where my do_something_parallel function only prints "test1" and nothing more.
The problem is that inside the do_something_parallel function it does not recognize item: when I print the error it says AttributeError: <unknown>.Name. Inside the for-loop I also tried to print item.Name, and there it works.
from concurrent.futures import ThreadPoolExecutor
def do_something_parallel(x, par2, par3, par4, par5):
    print("test1")
    print(str(x.Value))
    print("test2")

def main():
    for i in range(0, 38):
        with ThreadPoolExecutor(max_workers=4) as executor:
            futures = set()
            for x in range(0, 5):
                print(str(item.Name))
                f = executor.submit(do_something_parallel, x, par2, par3, par4, par5)
                futures.add(f)
Upvotes: 0
Views: 16397
Reputation: 3506
You have to take the ThreadPoolExecutor outside of the loop, and then the pattern looks like this:
from concurrent import futures
from concurrent.futures import ThreadPoolExecutor

def do_something(*args, **kwargs):
    """ Stub function to use with futures - your processing logic """
    print("Do something in parallel")
    return "result processed"

def main():
    # The important part - concurrent futures
    # - set the number of workers to the number of jobs to process

    # The number of workers you want to run in parallel
    workers_range = 3

    with ThreadPoolExecutor(workers_range) as executor:
        # Use the list jobs for the submitted futures
        # Use the list results_done for the results
        jobs = []
        results_done = []

        # Here you identify how many parallel tasks you want
        # and what value you'll send to them
        values = ["value1", "value2", "value3"]  # as per workers_range

        for value in values:
            # Pass some keyword arguments if needed - per job
            kw = {"some_param": value}

            # Submit one job per value - a different function
            # could be submitted on each call
            jobs.append(executor.submit(do_something, **kw))

        # Once parallel processing is complete, iterate over the results
        for job in futures.as_completed(jobs):
            # Read the result from the future
            result_done = job.result()
            # Append it to the list of results
            results_done.append(result_done)

    # Iterate over the results and do whatever is needed
    for result in results_done:
        print("Do something with me {}".format(result))
Just follow that pattern to get it working.
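Applied to the question's do_something_parallel with five arguments, a minimal sketch could look like the one below. This is only a sketch under assumptions: the Item namedtuple and the placeholder items/par2..par5 values are stand-ins for whatever objects your real code passes in, and the first argument has to be an object that actually has the attributes you access.

from collections import namedtuple
from concurrent.futures import ThreadPoolExecutor, as_completed

# Placeholder item type - in the real code these are presumably richer objects
Item = namedtuple("Item", ["Name", "Value"])

def do_something_parallel(item, par2, par3, par4, par5):
    print("test1")
    print(str(item.Value))
    print("test2")
    return item.Name

def main():
    # Placeholder data and parameters - replace with your real objects
    items = [Item("a", 1), Item("b", 2), Item("c", 3), Item("d", 4), Item("e", 5)]
    par2 = par3 = par4 = par5 = None

    # Create the executor once, outside the loop
    with ThreadPoolExecutor(max_workers=4) as executor:
        # Submit one job per item, passing all five parameters positionally
        jobs = [executor.submit(do_something_parallel, item, par2, par3, par4, par5)
                for item in items]

        # Collect results as the workers finish (completion order, not submission order)
        for job in as_completed(jobs):
            print(job.result())

if __name__ == "__main__":
    main()

Note that executor.submit passes its arguments straight through to the function, so the worker receives exactly what you submit: if you submit the loop index from range(0, 5) instead of the actual object, attribute access like item.Name will fail inside the function.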
Upvotes: 4