Sheikh Abdul Manan

Reputation: 81

Run Airflow HA Scheduler as systemd services

I want to run two Airflow schedulers, so I created a systemd template service file (a unit whose name ends in @.service). Now if I try to start both instances with

sudo systemctl start airflow-scheduler@{1..2} 

Only one of the schedulers starts; the other fails with the following error:

sqlalchemy.exc.DatabaseError: (mysql.connector.errors.DatabaseError) 3572 (HY000): Statement aborted because lock(s) could not be acquired immediately and NOWAIT is set.

My service file looks like this:

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=myuser
Group=myuser
Type=simple
ExecStart=/usr/local/bin/airflow scheduler
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target
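
The same brace expansion also works for enabling both instances at boot:

sudo systemctl enable airflow-scheduler@{1..2}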

Upvotes: 1

Views: 374

Answers (1)

Sheikh Abdul Manan

Reputation: 81

The problem was with the mysql-connector-python driver. I switched the connection string in the Airflow config file to use mysqldb instead, and it works fine now.
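
For anyone hitting the same error: the driver is selected by the SQLAlchemy connection string (sql_alchemy_conn) in airflow.cfg, under [core] in older releases and [database] from Airflow 2.3 onward. A minimal sketch of the change, with host, user, and password as placeholders:

# before (mysql-connector-python; triggers the NOWAIT lock error):
# sql_alchemy_conn = mysql+mysqlconnector://user:password@localhost:3306/airflow

# after (MySQLdb, provided by the mysqlclient package):
sql_alchemy_conn = mysql+mysqldb://user:password@localhost:3306/airflow

The mysql+mysqldb dialect requires the mysqlclient package installed in the same environment as Airflow:

pip install mysqlclient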

Upvotes: 1
