TIto

Reputation: 23

Load Variable in Airflow 2

I am attempting to follow the Airflow documentation to use variables in a DAG.

The variables are set in the Airflow UI, and I created a simple DAG that uses them. Loading the variables as shown in the documentation results in the following error:

[2021-02-18 16:32:15,350] {variable.py:64} ERROR - Can't decrypt _val for key=foo, invalid token or value
[2021-02-18 16:32:15,364] {taskinstance.py:1396} ERROR - 'Variable foo does not exist'

Below are the set variables and the code I am using. Any help would be greatly appreciated.

Screenshot of variables

"""
### Load Variable
"""
import airflow.utils.dates
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.models import Variable

dag = DAG(
    dag_id="load_var",
    tags=['example'],
    description='load variables',
    start_date=airflow.utils.dates.days_ago(1),
    schedule_interval="@once",
    max_active_runs=1,
)


def _get_data():
    foo = Variable.get("foo")
    print(foo)

    foo_json = Variable.get("foo_baz", deserialize_json=True)
    print(foo_json)

get_data = PythonOperator(
    task_id="get_data",
    python_callable=_get_data,
    dag=dag,
)

get_data

Upvotes: 2

Views: 2485

Answers (1)

NicoE

Reputation: 4873

I think the problem is that the Fernet key differs between the services you are running, or that the variables created from the UI were stored in the DB with a different Fernet key than the one you are currently using.
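For illustration, here is a small standalone sketch of why that error appears, using the same cryptography library Airflow relies on (the keys and the value are made up): a value encrypted with one Fernet key simply cannot be decrypted with another.

```python
# Sketch: a value encrypted with one Fernet key cannot be
# decrypted with a different key.
from cryptography.fernet import Fernet, InvalidToken

key_a = Fernet.generate_key()  # e.g. the key in use when the variable was created
key_b = Fernet.generate_key()  # e.g. the key a service is using now

token = Fernet(key_a).encrypt(b"bar")

# Same key: decryption succeeds
assert Fernet(key_a).decrypt(token) == b"bar"

# Different key: raises InvalidToken, which Airflow surfaces as
# "Can't decrypt _val for key=foo, invalid token or value"
try:
    Fernet(key_b).decrypt(token)
except InvalidToken:
    print("invalid token")
```

This is why all Airflow services (webserver, scheduler, workers) must share the same `AIRFLOW__CORE__FERNET_KEY`.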

To solve this, you can provide a key through an environment variable. To generate one, the docs provide this snippet:

from cryptography.fernet import Fernet

# Generate a new Fernet key and print it so it can be copied
# into an environment variable
fernet_key = Fernet.generate_key()
print(fernet_key.decode())

Take the printed value and assign it to an environment variable. Based on the docker-compose file you provided, it could look something like this:


# ====================================== AIRFLOW ENVIRONMENT VARIABLES ============================
x-environment: &airflow_environment
  - AIRFLOW__CORE__FERNET_KEY=your_fernet_key_here
  - AIRFLOW__CORE__EXECUTOR=LocalExecutor
  - AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS=False
  - AIRFLOW__CORE__LOAD_EXAMPLES=False
  - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql://airflow:airflow@postgres:5432/airflow
  - AIRFLOW__CORE__STORE_DAG_CODE=True
  - AIRFLOW__CORE__STORE_SERIALIZED_DAGS=True
  - AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True

If you have previous data in the metadata DB (like the variables you were testing with), you should drop it with airflow db reset (the Airflow 2 replacement for airflow resetdb; it removes everything from the DB). Also be advised that with this approach your Fernet key is exposed as plain text in the docker-compose file, which is not suitable for production.
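To avoid exposing the key, one option (a sketch, assuming Docker Compose variable substitution and a `FERNET_KEY` variable defined in your shell or in a `.env` file next to the compose file) is:

```yaml
x-environment: &airflow_environment
  # FERNET_KEY is substituted from the host environment or a .env file,
  # so the actual key never appears in the compose file itself
  - AIRFLOW__CORE__FERNET_KEY=${FERNET_KEY}
```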

Hope that works for you!

Upvotes: 1
