Dragos Losonti

Reputation: 11

Can I stop waiting for threads to finish if one of them produced results?

I'm making GET requests to a few hundred different API endpoints on different servers. One of these endpoints holds some information that I want to fetch and return.

As soon as any of these requests returns something, I want to terminate the other threads and exit. Some requests are almost instant, while others can take up to 20 seconds to finish.

If I happen to find the info in 2 seconds, I don't want to wait the full 20 seconds before resuming work.

Currently I'm doing things like this:

threads = list()
for s in silos: #here i create all the requests
    t = Thread(target=process_request, args=(my, args, here))
    t.name = "{} - {}".format(some, name)
    threads.append(t)

Then I do:

print("Threads: {}".format(len(threads)))  # 100 - 250 of them
[t.start() for t in threads]
[t.join() for t in threads]

process_request() simply makes the GET request and stores the result in a dict if status_code == 200. I'm using the requests and threading modules.
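For reference, a minimal sketch of what such a process_request might look like (the URL parameter, the results dict, and the lock are assumptions for illustration, not the asker's actual code):

```python
import threading

import requests

results = {}  # shared dict the workers write into
results_lock = threading.Lock()


def process_request(url):
    # Hypothetical worker: fetch the URL and record the body on success
    try:
        resp = requests.get(url, timeout=20)
    except requests.RequestException:
        return  # network error; this thread just gives up
    if resp.status_code == 200:
        with results_lock:  # many threads write to one dict
            results[url] = resp.text
```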

Upvotes: 1

Views: 40

Answers (1)

Anmol Singh Jaggi

Reputation: 8576

If you use a multiprocessing pool, then you can terminate the pool as soon as the first response arrives:

import multiprocessing as mp
import time


pool = None


def make_get_request(inputs):
    # Stand-in for the real GET request; the sleep simulates latency
    print('Making get request with inputs ' + str(inputs))
    time.sleep(2)
    return 'dummy response for inputs ' + str(inputs)


def log_response(response):
    # Runs in the main process as soon as any worker returns a result
    print("Got response = " + response)
    pool.terminate()  # stop all remaining workers immediately


def main():
    global pool
    pool = mp.Pool()
    for i in range(10):
        pool.apply_async(make_get_request, args=(i,), callback=log_response)
    pool.close()
    pool.join()


if __name__ == '__main__':
    main()
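To actually hand the first response back to the caller instead of just printing it, the callback can stash it before terminating. A sketch building on the code above; the first_result list is an assumed name, not part of the original answer:

```python
import multiprocessing as mp
import time

pool = None
first_result = []  # the callback appends the winning response here


def make_get_request(inputs):
    # Placeholder for the real GET request (simulated latency)
    time.sleep(2)
    return 'dummy response for inputs ' + str(inputs)


def log_response(response):
    # Runs in the main process: record the result, then stop everyone else
    first_result.append(response)
    pool.terminate()


def main():
    global pool
    pool = mp.Pool()
    for i in range(10):
        pool.apply_async(make_get_request, args=(i,), callback=log_response)
    pool.close()
    pool.join()
    return first_result[0] if first_result else None


if __name__ == '__main__':
    print(main())
```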

Upvotes: 1
