Jorge Nachtigall

Reputation: 539

Send logs to s3 - Official Airflow Helm Chart + KubernetesExecutor

I'm using the official Airflow Helm Chart to deploy KubernetesExecutor (still locally) on a KinD Cluster.

Because this is a Helm Chart, I'm having a lot of trouble trying to configure anything that is not explicitly shown in the documentation.

In this setup, I want to send all the log data produced by my DAGs to an S3 bucket (a common thing to do on the Airflow stack).

The problem is: there's nothing in the documentation, or in other threads, that helps me achieve this.

Is there anything that I can do?

Upvotes: 0

Views: 1429

Answers (1)

tfleischer

Reputation: 148

I'm not sure what exactly your problem is, but the following values.yaml works for me with the official Airflow Helm Chart:

config:
  logging:
    # Airflow can store logs remotely in AWS S3. Users must supply a remote
    # location URL (starting with 's3://') and an Airflow connection
    # id that provides access to the storage location.
    remote_logging: 'True'
    #colored_console_log: 'True'
    remote_base_log_folder: "s3://PATH"
    # the following connection must be created in the Airflow web UI
    remote_log_conn_id: 'S3Conn'
    # Use server-side encryption for logs stored in S3
    encrypt_s3_logs: 'True'

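If you don't want to create the connection by hand in the web UI, Airflow can also pick it up from an environment variable named AIRFLOW_CONN_<CONN_ID>, which you can set through the chart's env values. A minimal sketch, assuming the connection id S3Conn from above; the access key, secret, and region are placeholders, not values from the question:

env:
  # Airflow resolves connections from environment variables of the form
  # AIRFLOW_CONN_<CONN_ID> (upper-cased), matching remote_log_conn_id: 'S3Conn'.
  # For the aws connection type, the access key goes in the login part and the
  # secret key in the password part; the secret must be URL-encoded if it
  # contains special characters.
  - name: AIRFLOW_CONN_S3CONN
    value: "aws://EXAMPLE_ACCESS_KEY:EXAMPLE_SECRET_KEY@/?region_name=us-east-1"

For real credentials you would normally reference a Kubernetes Secret rather than inlining the value in values.yaml, but the plain variable is enough to verify that remote logging to S3 works.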
Upvotes: 4
