Reputation: 733
I have an up-and-running Apache Airflow 1.8.1 instance.
I have a working connection (and its ID) to write to Google Cloud Storage, and my airflow user has permission to write to the bucket.
I'm trying to use the remote log storage functionality by adding
remote_base_log_folder = 'gs://my-bucket/log'
remote_log_conn_id = 'my_working_conn_id'
And that's all (I didn't touch any other configuration).
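For reference, these two keys sit in the [core] section of airflow.cfg in 1.8; mine currently reads (values exactly as I entered them):

```ini
[core]
remote_base_log_folder = 'gs://my-bucket/log'
remote_log_conn_id = 'my_working_conn_id'
```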
I restarted all the services, but the logs aren't uploading to GCS (my bucket is still empty) and my local filesystem space keeps shrinking.
Have you successfully enabled remote logging with GCS? If so, what did you change / do?
Upvotes: 3
Views: 4097
Reputation: 716
I managed to get remote logging to GCS working. First, you need to give the service account permission to write to the GCS bucket.
This is my GCP connection setup:
Then, edit the airflow.cfg file:
remote_base_log_folder = gs://my-backup/airflow_logs
remote_log_conn_id = my_gcp_conn
After editing the config file, you need to re-initialize the database and restart the web server:
airflow initdb
# start the web server, default port is 8080
airflow webserver -p 8080
Test by turning on the "tutorial" DAG; you should be able to see the logs both locally and remotely in GCS:
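Once it works, each task attempt should show up as an object under the base folder. Here is a minimal sketch of the expected object path; the exact per-task layout (dag_id/task_id/execution_date/try_number.log) is an assumption mirroring Airflow 1.8's local log folder structure, so verify it against what actually appears in your bucket:

```python
# Sketch of where a task's log should land in the bucket.
# The layout below is an assumption based on Airflow 1.8's local
# log folder structure (base/dag_id/task_id/execution_date/try.log).
def remote_log_path(base, dag_id, task_id, execution_date, try_number=1):
    # Strip a trailing slash so we don't emit double slashes in the path.
    return "{0}/{1}/{2}/{3}/{4}.log".format(
        base.rstrip("/"), dag_id, task_id, execution_date, try_number)

print(remote_log_path("gs://my-backup/airflow_logs",
                      "tutorial", "print_date", "2017-08-01T00:00:00"))
# -> gs://my-backup/airflow_logs/tutorial/print_date/2017-08-01T00:00:00/1.log
```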
Upvotes: 5