Sagar Bhatt

Reputation: 31

Logs saved locally and on AWS S3, but not loading in Airflow UI (Airflow version 2.5.1)

I'm experiencing an issue with my Apache Airflow setup: logs are being saved successfully both locally and to AWS S3, but when I try to view them in the Airflow UI they do not load, and the UI keeps spinning.

Here is the [logging] section of my airflow.cfg:

base_log_folder = /home/ubuntu/airflow/logs
remote_logging = True
remote_log_conn_id = daairflowupgradelogs
google_key_path =
remote_base_log_folder = s3://da-airflow-logging/daairflowupgradelogs
encrypt_s3_logs = False
logging_level = INFO
celery_logging_level =
fab_logging_level = WARNING
logging_config_class =
colored_console_log = True
colored_log_format = [%(blue)s%(asctime)s%(reset)s] {%(blue)s%(filename)s:%(reset)s%(lineno)d} %(log_color)s%(levelname)s%(reset)s - %(log_color)s%(message)s%(reset)s
colored_formatter_class = airflow.utils.log.colored_log.CustomTTYColoredFormatter
log_format = [%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s
simple_log_format = %(asctime)s %(levelname)s - %(message)s
dag_processor_log_target = file
dag_processor_log_format = [%(asctime)s] [SOURCE:DAG_PROCESSOR] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s
log_formatter_class = airflow.utils.log.timezone_aware.TimezoneAware
task_log_prefix_template =
log_filename_template = dag_id={{ ti.dag_id }}/run_id={{ ti.run_id }}/task_id={{ ti.task_id }}/{% if ti.map_index >= 0 %}map_index={{ ti.map_index }}/{% endif %}attempt={{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
dag_processor_manager_log_location = /home/ubuntu/airflow/logs/dag_processor_manager/dag_processor_manager.log
task_log_reader = task
extra_logger_names =
worker_log_server_port = 8793

I've configured an AWS connection (aws_default) in Airflow, and the IAM role associated with it has the necessary permissions to access the S3 bucket.
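
As a sanity check, the connection can list the log prefix directly. This is a minimal sketch, with the bucket and prefix taken from remote_base_log_folder above; run it from inside whichever pod's access you want to verify:

# Sanity check: can this Airflow connection list the log prefix?
# An exception here means a credentials or permissions problem in this pod.
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

hook = S3Hook(aws_conn_id="daairflowupgradelogs")
keys = hook.list_keys(
    bucket_name="da-airflow-logging",
    prefix="daairflowupgradelogs/",
    max_items=5,
)
print(keys)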

Additional Information:

  1. I'm using Airflow in an Amazon Elastic Kubernetes Service (EKS) environment.
  2. I've verified that the logs are indeed being saved to the specified S3 bucket.
  3. The IAM roles and permissions seem to be correctly configured for both EKS and S3.
  4. I've checked the Airflow worker pods, and they appear to have the correct environment variables and AWS credentials (see the identity probe after this list).
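
To go beyond checking environment variables, a minimal STS probe like the following shows which IAM principal a pod actually resolves to (assuming boto3 is importable in the image):

# Which IAM principal does this pod actually assume? With IRSA configured
# correctly, the printed ARN should be the role from the service account
# annotation, not the EKS node's instance role.
import boto3

print(boto3.client("sts").get_caller_identity()["Arn"])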

[screenshot: the Airflow UI log view, stuck loading]


I would greatly appreciate any insights or guidance on how to resolve this issue and get the logs to load in the Airflow UI.

Upvotes: 2

Views: 425

Answers (1)

Maria Dubyaga

Reputation: 41

Check the logs of your webserver pod. Most likely it's complaining about missing credentials: the logs are uploaded to S3 by your worker, but it is the webserver that streams them back for the UI. Check whether your webserver's service account has the correct IRSA annotation (eks.amazonaws.com/role-arn); you can confirm the read path with the sketch below.
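
As a quick test, exec into the webserver pod and try to read a task log back through the same connection the UI uses. This is a rough sketch; the conn id, bucket and prefix are taken from the config in the question:

# Run inside the *webserver* pod. The worker uploaded the logs fine, so the
# question is whether the webserver can read them back.
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

hook = S3Hook(aws_conn_id="daairflowupgradelogs")
keys = hook.list_keys(
    bucket_name="da-airflow-logging",
    prefix="daairflowupgradelogs/",
    max_items=1,
)
print(hook.read_key(keys[0], bucket_name="da-airflow-logging") if keys else "no log keys found")

If that raises NoCredentialsError or an AccessDenied error, inspect the webserver's service account with kubectl describe serviceaccount and look for the eks.amazonaws.com/role-arn annotation.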

Upvotes: 0
