Reputation: 13
I have a Django management command that I run like this: "python manage.py updateprices". It performs this query:
activos = SearchTag.objects.filter(is_active=True)
I would like to split work like this into smaller chunks of my database
e.g. activos[0:2000]
and run the chunks at the same time, to speed up the process.
How can I do that?
Upvotes: 1
Views: 990
Reputation: 9685
If I understand correctly, you want to process the DB query in groups.
First off, build a list of queryset slices:
q_list = []
active_q = SearchTag.objects.filter(is_active=True)
# Start from 0 and jump in steps of 2,000: with 20,000 rows you end up
# with a list of 10 querysets of 2,000 rows each.
for i in range(0, active_q.count(), 2000):
    q_list.append(active_q[i:i + 2000])
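The slicing arithmetic above is plain Python, so here is a minimal standalone sketch of it (a plain list stands in for the queryset; slicing a Django queryset works the same way, it just adds LIMIT/OFFSET to the SQL):

```python
# Standalone illustration of the chunking arithmetic.
rows = list(range(20000))  # pretend these are 20,000 SearchTag rows
chunk_size = 2000

chunks = [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

print(len(chunks))     # 10 chunks
print(len(chunks[0]))  # 2,000 rows in each
```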
Now that you have the list of chunks you want to operate on, you can easily pass it to a pool:
from multiprocessing import cpu_count, Pool
pool = Pool(processes=cpu_count())
pool.map(function_on_each_q, q_list)
Obviously you would need to write the function that does whatever it is you want. Note that pool.map calls it once per chunk, so it receives a single queryset, not the whole list:
def function_on_each_q(queryset):
    # do whatever you need with this chunk of rows
Upvotes: 2