Reputation: 15
I have a function that encrypts a number and stores it in a list:
encrypted = [[0]*10]*1000
def encrypt(i):
    encrypted[i] = bin(i)[2:].zfill(10).decode('hex')
The expression is much more complex than this. I am just stating an example.
Now I want to call the encrypt function inside a for loop, with the calls running in different processes or threads. However, since this is CPU-bound, the GIL means threads won't help - correct me if I am wrong.
for i in xrange(1000):
    encrypt(i)
So the loop should not wait for the encryption of one value to finish before starting the next.
When i=1 and the encryption of 1 is taking place, the loop should move on and start encrypting 2, and then 3, simultaneously.
The results of the encryption should be stored in the encrypted list (the order of the results is not important).
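To make the intent concrete, something like the sketch below is what I am after (multiprocessing.Pool and apply_async are used here purely as an illustration; the callback stores each result back into encrypted as it arrives, in whatever order the workers finish):
from multiprocessing import Pool

encrypted = [None] * 1000

def encrypt(i):
    # stand-in for the real, much more complex expression;
    # return the result instead of writing to a global,
    # since each worker process has its own copy of encrypted
    return i, bin(i)[2:].zfill(10).decode('hex')

def store(result):
    # callback runs in the parent process as results arrive (any order)
    i, value = result
    encrypted[i] = value

if __name__ == '__main__':
    pool = Pool(processes=4)
    for i in xrange(1000):
        pool.apply_async(encrypt, (i,), callback=store)
    pool.close()
    pool.join()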
Upvotes: 0
Views: 2035
Reputation: 29967
You can use multiprocessing.Pool:
from multiprocessing import Pool
def encrypt(i):
    return bin(i)[2:].zfill(10).decode('hex')

if __name__ == '__main__':
    pool = Pool(processes=4)  # adjust to number of cores
    result = pool.map(encrypt, range(1000))
    print result
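Since you said the order of the results is not important, a small variation on the same idea (reusing the Pool import and encrypt above; this is just a sketch) is pool.imap_unordered, which hands results back in completion order rather than input order:
if __name__ == '__main__':
    pool = Pool(processes=4)
    # results come back in whatever order the worker processes finish
    result = list(pool.imap_unordered(encrypt, range(1000)))
    pool.close()
    pool.join()
    print result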
Upvotes: 1
Reputation: 116
Alright, first some advice: depending on the number of threads you need to run, you should check out PyPy; this sounds like the kind of project that could benefit heavily from PyPy's features.
Here is an edited example from the Queue docs. If I understand what you need, then this should point you in the right direction.
This code assumes that you have a list of numbers to encrypt and that your encrypt function handles adding the results to a list or storing them somehow.
from Queue import Queue      # Python 2 module name
from threading import Thread

num_worker_threads = 4       # the original snippet left this undefined; pick a sensible value

def worker():
    while True:
        number = q.get()
        encrypt(number)      # encrypt is expected to store its own result
        q.task_done()

q = Queue()
for i in range(num_worker_threads):
    t = Thread(target=worker)
    t.daemon = True
    t.start()

for number in numbers:       # numbers is your list of numbers to encrypt
    q.put(number)

q.join()    # block until all tasks are done
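For the "storing them somehow" part, one possible approach (just a sketch; the results list and results_lock names are mine, and it assumes encrypt is changed to return its value) is to have each worker write into a pre-sized list under a lock:
from threading import Lock

results = [None] * 1000      # pre-sized, like the asker's encrypted list
results_lock = Lock()

def worker():
    while True:
        number = q.get()
        value = encrypt(number)        # assumes encrypt returns the value here
        with results_lock:
            results[number] = value    # slot keyed by the input number
        q.task_done()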
Upvotes: 1