Reputation: 243
I have a docker-compose consisting of four containers, all of which perform a single function:
An nginx proxy that forwards UI and API requests to the corresponding containers (node container, flask container), as depicted in the image below.
There is also a separate container which executes long-running Python scripts and works independently of the other containers. I'd now like to add the ability to execute scripts in the "long running scripts" (LRS) container via the API:
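For reference, the layout described above might look roughly like this in a `docker-compose.yml` — the service names (`ui`, `api`, `lrs`) and build paths are placeholders for illustration, not taken from the actual setup:

```yaml
version: "3"
services:
  nginx:          # proxy forwarding UI and API requests
    image: nginx:alpine
    ports:
      - "80:80"
    depends_on:
      - ui
      - api
  ui:             # node container
    build: ./ui
  api:            # flask container
    build: ./api
  lrs:            # long-running scripts container
    build: ./lrs
```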
What is the best way to do this?
I've seen a few other questions that are somewhat similar to this, but they raise more questions than they answer. Amongst the suggestions I've seen are:
None of these solutions seems ideal. Am I missing something? How do I achieve the intended functionality?
Upvotes: 1
Views: 269
Reputation: 9446
Generally, the solution for running long-running scripts is a pub/sub model. Your API drops a message onto an execution message queue. The worker instance subscribes to that queue and, when messages appear, executes your long-running script/query/etc. When execution completes, either a message goes back onto a different queue, or the results are placed in a predetermined location (URL).
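As a minimal in-process sketch of that pattern (using Python's stdlib `queue` and `threading` as a stand-in for a real broker such as Redis or RabbitMQ; the names `task_queue`, `run_script`, and `results` are illustrative, not from any specific library):

```python
import queue
import threading

task_queue = queue.Queue()   # stand-in for the execution message queue
results = {}                 # stand-in for the "predetermined location"

def run_script(script_name):
    # Placeholder for invoking the actual long-running script.
    return f"{script_name} finished"

def worker():
    # The LRS container would run a loop like this, subscribed to the queue.
    while True:
        job = task_queue.get()
        if job is None:      # sentinel value used to shut the worker down
            task_queue.task_done()
            break
        job_id, script_name = job
        results[job_id] = run_script(script_name)
        task_queue.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

# The API handler just enqueues a job and returns immediately;
# a client would later poll for the result instead of blocking.
task_queue.put(("job-1", "analysis.py"))
task_queue.put(None)
task_queue.join()
print(results["job-1"])  # -> analysis.py finished
```

In a real deployment the queue lives in a broker shared between the `api` and LRS containers, so the two processes stay decoupled exactly as in the compose setup described in the question.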
This has a couple of advantages:
Upvotes: 2