anarchy

Reputation: 551

Running multiple tasks of one DAG on separate machines in Airflow

I need to create a dag which looks like this-

[image: airflow dag snapshot]

The print_date task needs to run on server A, and the templated task needs to run on server B. From the documentation it is clear that Celery with Redis or RabbitMQ will be required. I am using Celery along with Redis (puckel/docker-airflow). I already have Airflow running on server B with the Celery executor.
Do I need the same setup on server A as well? Also, how will I connect these two tasks in a single DAG when they actually run on different servers? A sample framework for this kind of use case would be much appreciated.

Upvotes: 0

Views: 2201

Answers (1)

kaxil

Reputation: 18844

Use Airflow Queues. When you define a task, add a queue parameter to assign it to a particular queue.

For example, a worker listening on queue1 on Machine 1 would run all tasks assigned to queue1, and a worker listening on queue2 on Machine 2 would run all tasks assigned to queue2.

So you can assign task A to queue1, hence it runs on Machine 1, and assign task B to queue2, hence it runs on Machine 2.

Check documentation at https://airflow.apache.org/concepts.html#queues

Upvotes: 1

Related Questions