arturkuchynski

Reputation: 1000

Airflow CeleryExecutor - 'int' object has no attribute 'startswith' in Celery

Airflow 2.0 is queuing but not launching tasks in my dev environment.

DAG and Pool settings are valid, but every task in every DAG stays queued after I trigger it and never starts running.

When I run airflow celery worker, I get the following error:

Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/celery_command.py", line 188, in worker
    _run_worker(options=options, skip_serve_logs=skip_serve_logs)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/celery_command.py", line 94, in _run_worker
    celery_app.worker_main(options)
  File "/usr/local/lib/python3.8/site-packages/celery/app/base.py", line 365, in worker_main
    return instantiate(
  File "/usr/local/lib/python3.8/site-packages/celery/bin/base.py", line 283, in execute_from_commandline
    self.maybe_patch_concurrency(argv)
  File "/usr/local/lib/python3.8/site-packages/celery/bin/base.py", line 315, in maybe_patch_concurrency
    maybe_patch_concurrency(argv, *pool_option)
  File "/usr/local/lib/python3.8/site-packages/celery/__init__.py", line 143, in maybe_patch_concurrency
    pool = _find_option_with_arg(argv, short_opts, long_opts)
  File "/usr/local/lib/python3.8/site-packages/celery/__init__.py", line 95, in _find_option_with_arg
    if arg.startswith('-'):
AttributeError: 'int' object has no attribute 'startswith'

In my staging and prod environments there are no issues with running tasks, and if I run airflow celery worker there, it either starts or warns me that it is already running (as expected).

There is no difference between the environments, but I suspect the problem appeared after the most recent deploy to the server.

As far as I can see, Celery received an argument of the wrong type:

AttributeError: 'int' object has no attribute 'startswith'

But how can I trace which parameters Airflow is trying to pass to Celery? I have no idea how to debug this.
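For reference, the failure mode in the traceback can be reproduced in isolation. The sketch below mimics Celery 4.x's internal _find_option_with_arg, which calls .startswith on every element of argv; if Airflow hands it a list containing a non-string value (an int is used here purely for illustration), it raises the same AttributeError. The function body is a simplified assumption based on the traceback, not Celery's exact source.

```python
# Simplified sketch of Celery 4.x's _find_option_with_arg (illustrative,
# not the exact upstream code). It scans argv for a pool option and calls
# .startswith on each element, so a non-string entry breaks it.
def find_option_with_arg(argv, short_opts=("-P",), long_opts=("--pool",)):
    for i, arg in enumerate(argv):
        if arg.startswith("-"):  # AttributeError if arg is an int
            if arg in short_opts or arg in long_opts:
                return argv[i + 1]
    return None

# A string-only argv works:
find_option_with_arg(["worker", "-P", "prefork"])  # returns "prefork"

# An int in argv reproduces the crash:
# find_option_with_arg(["worker", "--concurrency", 8])
# -> AttributeError: 'int' object has no attribute 'startswith'
```

So one way to debug is to inspect the options list that Airflow's _run_worker passes to celery_app.worker_main (e.g. with a breakpoint or a print in celery_command.py) and check whether any element is not a string.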

Upvotes: 3

Views: 1677

Answers (2)

Geobio Boo

Reputation: 1354

For Airflow 2.1.0+, you need Celery 5.1.2+.

The documentation here explains the need for Celery 5.1.2+: https://airflow.apache.org/docs/apache-airflow-providers-celery/stable/index.html#pip-requirements

Upvotes: 2

arturkuchynski

Reputation: 1000

Solved by upgrading Celery from 4.4.2 to its latest version, 5.1.2.

It seems that version 4.4.2 (which is one of Airflow's dependencies) had a bug with argument handling.
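After upgrading (e.g. pip install --upgrade "celery>=5.1.2"), it can be worth verifying that the environment Airflow runs in actually picked up a new enough version. A minimal version-comparison sketch, assuming plain x.y.z version strings:

```python
# Check a Celery version string against the 5.1.2 minimum required by the
# Airflow Celery provider. Assumes simple dotted numeric versions; ignores
# any pre-release suffixes.
def parse(v):
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

def meets_minimum(installed, minimum="5.1.2"):
    return parse(installed) >= parse(minimum)

print(meets_minimum("4.4.2"))  # False: the buggy version from this question
print(meets_minimum("5.1.2"))  # True
```

In practice you can feed it the output of python -c "import celery; print(celery.__version__)" from the same environment that runs airflow celery worker.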

If there are any other suggestions on how to solve this issue, feel free to mention them here.

Upvotes: 7
