Reputation: 622
My goal is to easily export events from Airflow into other systems. One option is to create a plugin that can access Airflow's internal state and expose it via a REST API (there are some existing implementations), but what I'm more concerned about is whether it would be possible to plug into Airflow's event log and stream those messages to an external message queue (e.g. Kafka, PubSub, Kinesis).
Upvotes: 0
Views: 651
Reputation: 4366
The easiest way I could imagine accomplishing this is with the `sqlalchemy.event.listens_for` decorator, attached to the various Airflow models, filtering for the model events you're looking to shuttle off to the message queue.
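A minimal sketch of that approach: in real Airflow you would attach the listener to an actual model such as `airflow.models.log.Log` and publish with a real client (e.g. a Kafka producer); here a stand-in model and an in-memory `published` sink (both hypothetical) show the mechanism without needing Airflow or a broker installed.

```python
# Sketch: forward freshly inserted event-log rows to an external queue
# using SQLAlchemy's mapper event system. The Log class below is a
# stand-in for airflow.models.log.Log, and publish() is a stand-in for
# a real producer (e.g. KafkaProducer(...).send("airflow-events", ...)).
from sqlalchemy import Column, Integer, String, create_engine, event
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Log(Base):  # stand-in for Airflow's log table model
    __tablename__ = "log"
    id = Column(Integer, primary_key=True)
    event = Column(String)
    dag_id = Column(String)

published = []  # in-memory sink; swap for a real message-queue client

def publish(message: dict) -> None:
    published.append(message)

@event.listens_for(Log, "after_insert")
def forward_log_row(mapper, connection, target):
    # Fires once per INSERT on the log table; filter here if you only
    # want certain event types shipped to the queue.
    publish({"event": target.event, "dag_id": target.dag_id})

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as session:
    session.add(Log(event="trigger", dag_id="example_dag"))
    session.commit()  # flush triggers after_insert, so publish() runs
```

Note that `after_insert` runs inside the flush, so a slow or blocking producer call here will slow down Airflow's own database writes; buffering locally and flushing asynchronously is worth considering.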
You could do this in an `airflow_local_settings` module so that it's loaded up automatically on startup. Then place some extra configuration values in your `airflow.cfg` file that drive the settings for the remote message queue.
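For example, the queue connection details could live in a custom section of `airflow.cfg` (the section and key names below are made up for illustration), which your `airflow_local_settings` module can read at startup via `airflow.configuration.conf.get(...)`:

```ini
# airflow.cfg -- hypothetical custom section for the event forwarder
[event_stream]
backend = kafka
bootstrap_servers = kafka-broker:9092
topic = airflow-events
```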
Upvotes: 1