halil

Reputation: 1812

Airflow scheduler cannot connect to Kubernetes service API

I am trying to set up Airflow with the Kubernetes executor. On scheduler container startup it hangs for a while and then I get an HTTPS timeout error, shown below. The IP address in the message is correct, and inside the container I can run `curl kubernetes:443`, `curl 10.96.0.1:443`, or `nc -zv 10.96.0.1 443`, so I assume there is no firewall or similar blocking access.

I am using a local Kubernetes cluster as well as AWS EKS and get the same error in both; only the IP address differs between clusters.

I have searched Google for a solution but did not find any similar cases.

  File "/usr/local/lib/python3.6/site-packages/airflow/contrib/executors/kubernetes_executor.py", line 335, in run
    self.worker_uuid, self.kube_config)
  File "/usr/local/lib/python3.6/site-packages/airflow/contrib/executors/kubernetes_executor.py", line 359, in _run
    **kwargs):
  File "/usr/local/lib/python3.6/site-packages/kubernetes/watch/watch.py", line 144, in stream
    for line in iter_resp_lines(resp):
  File "/usr/local/lib/python3.6/site-packages/kubernetes/watch/watch.py", line 48, in iter_resp_lines
    for seg in resp.read_chunked(decode_content=False):
  File "/usr/local/lib/python3.6/site-packages/urllib3/response.py", line 781, in read_chunked
    self._original_response.close()
  File "/usr/local/lib/python3.6/contextlib.py", line 99, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/local/lib/python3.6/site-packages/urllib3/response.py", line 430, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='10.96.0.1', port=443): Read timed out.

Update: I found the cause of my problem, but no solution yet: https://github.com/kubernetes-client/python/issues/990
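Note that the traceback shows the TCP connection itself succeeded; it is the *read* on the watch stream that times out, which is why `curl` and `nc` connectivity checks pass. A minimal stdlib sketch of that connect-vs-read distinction (the local server here is hypothetical, standing in for an API endpoint that accepts connections but stays silent):

```python
import socket

# A listening socket that never sends data, simulating an API server
# whose watch stream goes quiet.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)

# The connect phase succeeds immediately (so curl/nc look healthy) ...
client = socket.create_connection(server.getsockname(), timeout=1.0)

# ... but a read with its own timeout fails, because no data ever arrives.
client.settimeout(0.5)
try:
    client.recv(1)
except socket.timeout:
    print("read timed out")  # this is the ReadTimeoutError case

client.close()
server.close()
```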

Upvotes: 2

Views: 3947

Answers (1)

midNight

Reputation: 165

There is an option to set this timeout via an environment variable. In your charts/airflow.yaml file, set the variable as follows; that should solve your problem:

AIRFLOW__KUBERNETES__KUBE_CLIENT_REQUEST_ARGS: '{"_request_timeout": [50, 50]}'
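As far as I can tell, Airflow JSON-decodes this value and passes the result as keyword arguments to the Kubernetes client's watch call, so `_request_timeout` becomes a `[connect, read]` timeout pair in seconds. A rough sketch of that parsing step (not the actual Airflow code):

```python
import json
import os

# The env-var value is a JSON document; note it must reach Airflow as a
# string, which is why the value is quoted in the YAML above.
os.environ["AIRFLOW__KUBERNETES__KUBE_CLIENT_REQUEST_ARGS"] = (
    '{"_request_timeout": [50, 50]}'
)

# Decode it into the kwargs dict handed to the Kubernetes watch request.
kwargs = json.loads(os.environ["AIRFLOW__KUBERNETES__KUBE_CLIENT_REQUEST_ARGS"])
print(kwargs["_request_timeout"])  # [50, 50] -> (connect, read) timeouts
```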

Full airflow.yaml:

airflow:
  image:
    repository: airflow-docker-local
    tag: 1
  executor: Kubernetes
  service:
    type: LoadBalancer
  config:
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://postgres:airflow@airflow-postgresql:5432/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://postgres:airflow@airflow-postgresql:5432/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:airflow@airflow-redis-master:6379/0
    AIRFLOW__CORE__REMOTE_LOGGING: True
    AIRFLOW__CORE__REMOTE_LOG_CONN_ID: my_s3_connection
    AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: s3://xxx-airflow/logs
    AIRFLOW__WEBSERVER__LOG_FETCH_TIMEOUT_SEC: 25
    AIRFLOW__CORE__LOAD_EXAMPLES: True
    AIRFLOW__WEBSERVER__EXPOSE_CONFIG: True
    AIRFLOW__CORE__FERNET_KEY: -xyz=
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY: airflow-docker-local
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_TAG: 1
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_IMAGE_PULL_POLICY: Never
    AIRFLOW__KUBERNETES__WORKER_SERVICE_ACCOUNT_NAME: airflow
    AIRFLOW__KUBERNETES__DAGS_VOLUME_CLAIM: airflow
    AIRFLOW__KUBERNETES__NAMESPACE: airflow
    AIRFLOW__KUBERNETES__KUBE_CLIENT_REQUEST_ARGS: '{"_request_timeout": [50, 50]}'


persistence:
  enabled: true
  existingClaim: ''

workers:
  enabled: true

postgresql:
  enabled: true

redis:
  enabled: true

Upvotes: 1
