TAPeri

Reputation: 13

Scheduler not updating package files

I'm developing a DAG on Cloud Composer; my code is split into a main Python file and one package with subfolders. It looks like this:

my_dag1.py
package1/__init__.py
package1/functions.py
package1/package2/__init__.py
package1/package2/more_functions.py

I updated one of the functions in package1/functions.py to take an additional argument (and updated the reference in my_dag1.py). The code ran correctly in my local environment, and I was not getting any errors when running

gcloud beta composer environments run my-airflow-environment list_dags --location europe-west1

But the Web UI raised a Python error:

TypeError: my_function() got an unexpected keyword argument 'new_argument'
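For reference, here is a simplified sketch of the two pieces involved. The my_function name, the new_argument keyword, and the package1.functions path are the real ones; the other parameter name, the function body, and the way the function is called inside the DAG file are simplified for illustration.

# package1/functions.py (body simplified)
def my_function(original_argument, new_argument=None):
    # new_argument is the parameter added in the recent update;
    # original_argument is a placeholder name for the pre-existing parameter
    print(original_argument, new_argument)

# my_dag1.py (surrounding DAG definition omitted)
from package1.functions import my_function

# updated call passing the new keyword argument; this is what raises the
# TypeError above when the Web UI parses the file
result = my_function('some_value', new_argument='another_value')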


The error disappeared only after I renamed the package folder.


I suspect the issue is related to the scheduler picking up my_dag1.py but not package1/functions.py. The error appeared out of nowhere, as I had made similar updates in the previous weeks.

Any idea on how to fix this issue without refactoring the whole code structure?


EDIT-1

Here's the link to the related discussion on Google Groups.

Upvotes: 1

Views: 3804

Answers (2)

Franco Piccolo

Reputation: 7430

Try restarting the webserver with:

gcloud beta composer environments restart-web-server ENVIRONMENT_NAME --location=LOCATION

Upvotes: 0

Kun Wang

Reputation: 81

I've run into a similar issue: the "Broken DAG" error won't dismiss in the Web UI. I guess this is a caching bug in the Airflow web server.

  • Background.

    1. I created a customized operator using Airflow's plugin feature.
    2. After I imported the customized operator, the Airflow Web UI kept showing a Broken DAG error saying that it couldn't find the customized operator.
  • Why I think it's a bug in the Airflow web server.

    1. I can manually run the DAG with the airflow test command, so the import should be correct.
    2. Even if I remove the related DAG file from Airflow's /dags/ folder, the error is still there.
  • Here is what I did to resolve the issue.

    1. Restart the Airflow web server (sometimes this alone resolves the issue).
    2. Make sure no DAG is running, then restart the Airflow scheduler.
    3. Make sure no DAG is running, then restart the Airflow worker.

Hopefully this can help someone who has the same issue.

Upvotes: 1
