Dejell

Reputation: 14337

Using Django in a multithreading code

I am running a Python script that uses the Django ORM to handle calls to the database.

I have the following code:

with ThreadPoolExecutor(max_workers=2) as executor:
    executor.submit(process, 2)
    executor.submit(process, 1)

@transaction.atomic
def process(counter):
    MyModel.objects.filter(user_id=counter).delete()
    users = <create User objects>
    MyModel.objects.bulk_create(users)

I am using Django's default behavior, which sets autocommit to True.

When I debug the code, it looks like Django shares the same connection across the application, so the last thread to exit the process method commits all the transactions that used that connection.

How can I prevent this from happening? I would like Django to commit and close the connection at the end of each method so that each thread is handled separately.

Upvotes: 0

Views: 441

Answers (1)

Bojan Kogoj

Reputation: 5649

I suggest using Celery. You will need Redis or RabbitMQ as a broker to pass messages between Django and Celery.

@app.task
def process(counter):
    MyModel.objects.filter(user_id=counter).delete()
    users = <create User objects>
    MyModel.objects.bulk_create(users)

After that, call the task with apply_async:

process.apply_async((2,))
process.apply_async((1,))

Setting this up is a bit more complicated, but works well.
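For completeness, a minimal sketch of the Celery app that the @app.task decorator above assumes (the module layout, project name, and Redis URL are all hypothetical):

```python
# celery.py -- hypothetical minimal Celery app definition.
# Assumes Redis is running locally as the message broker.
from celery import Celery

app = Celery('myproject', broker='redis://localhost:6379/0')
```

With a worker running (e.g. celery -A myproject worker), the apply_async calls above are picked up from the queue and each task runs in its own transaction.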

Upvotes: 1
