Thisisstackoverflow

Reputation: 261

How to implement a queue system that is shared between separate python processes?

We have a python script that is kicked off after a user enters some data in a web form. The entire script takes 10-20 minutes to complete. A user could technically kick off 10 of these within 10 seconds if they so choose. If this happens, we'll have 10 copies of the same python script running at once, causing each other to fail due to various things the script is processing.

What is the go-to way to code an overarching queueing system so that these scripts know of each other and will wait in line to execute? We are people who usually write one-off scripts but need to have this constant queueing system in place for the web form... sadly we aren't developers and don't have that background.

We're also open to different ways of architecting the solution in general. We went into this hacking it together. We've never built a broker/worker process or service, but might that make sense here?

How do people normally do this stuff?

Upvotes: 2

Views: 883

Answers (1)

matthew.

Reputation: 176

Welcome to the wild world of distributing your computation!

Implementing your own queuing system (even in Python) can lead to a world of hurt. A very popular, enterprise-grade, open-source message queuing application is RabbitMQ. They have a great starter tutorial that walks through how to configure it and gives examples of its uses.

Additionally, there is a Python task queue library called Celery that can use RabbitMQ under the hood. It is a bit smaller in focus and capability, but offers ease of use and a faster start-up time as a trade-off. One thing it does not trade off is RabbitMQ's consistency, which, as you delve deeper into queuing and distributed systems, you will learn is extremely important. Their getting started docs can be found here
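If all you need right now is "only one copy of this script runs at a time," you can get surprisingly far before adopting a broker with a plain file lock (Unix-only, via `fcntl.flock`): every launched script tries to take an exclusive lock and blocks until the previous run releases it, so concurrent launches simply wait in line. The lock path and `run_exclusively` helper below are hypothetical names for illustration:

```python
import fcntl

LOCK_PATH = "/tmp/webform_script.lock"  # hypothetical: any path all copies can see

def run_exclusively(job, lock_path=LOCK_PATH):
    """Wait until no other copy of the script holds the lock, then run job()."""
    with open(lock_path, "w") as lock_file:
        # LOCK_EX blocks until the current holder releases it, so each
        # newly launched script instance queues up behind the running one.
        fcntl.flock(lock_file, fcntl.LOCK_EX)
        try:
            return job()
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)  # also released on process exit
```

The trade-off versus Celery/RabbitMQ: no visibility into the queue, no retries, no ordering guarantee beyond "one at a time," and it only works on a single machine. But it is a few lines and has nothing to deploy.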


Upvotes: 3
