Plengo

Reputation: 97

How to get Airflow db credentials from Google Cloud Composer

I currently need the Airflow database connection credentials for my Airflow instance in Cloud Composer.
All I can see in the Airflow connections UI is the airflow_db connection (type mysql, host airflow-sqlproxy-service).

I would like to connect to it via DataGrip.
Also, if I want to override the [core] sql_alchemy_conn setting, how do I do that? It is restricted when I try to add it as an environment variable in my Cloud Composer environment.

Upvotes: 5

Views: 2512

Answers (2)

gavinest

Reputation: 348

Adding to David's answer:

After following the directions to connect to a worker in the GKE cluster, the $AIRFLOW_SQLPROXY_SERVICE_SERVICE_HOST env variable did not exist for me. Instead, I parsed the connection details from the SQLAlchemy connection string env variable that Composer sets.

echo $AIRFLOW__CORE__SQL_ALCHEMY_CONN

Should return a connection string of the form: mysql+mysqlconnector://<user>:<password>@<host>[:<port>]/<dbname>.

Parsing the host, user, and dbname from this, I'm able to connect with:

mysql -h <host> -u <user> <dbname>
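
If you want to script that step, a rough shell sketch like the following could pull the pieces out of the connection string (the sed expressions assume the exact mysql+mysqlconnector://<user>:<password>@<host>[:<port>]/<dbname> layout above and are only illustrative):

CONN="$AIRFLOW__CORE__SQL_ALCHEMY_CONN"
# user is everything between :// and the next :
DB_USER=$(echo "$CONN" | sed -E 's|^[^:]+://([^:]+):.*$|\1|')
# password is everything between the user's trailing : and the @
DB_PASS=$(echo "$CONN" | sed -E 's|^[^:]+://[^:]+:([^@]+)@.*$|\1|')
# host is everything between @ and the optional :port or the /
DB_HOST=$(echo "$CONN" | sed -E 's|^.*@([^:/]+).*/.*$|\1|')
# dbname is everything after the last /
DB_NAME=$(echo "$CONN" | sed -E 's|^.*/([^/?]+)$|\1|')
mysql -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" "$DB_NAME"

This will break on passwords containing : or @, so treat it as a starting point rather than something robust.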

Upvotes: 0

David

Reputation: 9721

Cloud Composer isn't designed to give external access to the database. However, you can connect to the GKE cluster and then access the database from within the cluster. This doc shows how to do that with SQLAlchemy, but you can also get direct MySQL CLI access by running mysql -h $AIRFLOW_SQLPROXY_SERVICE_SERVICE_HOST -u root airflow-db in step 6 instead of using SQLAlchemy.
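
For reference, the cluster-side steps look roughly like this (the cluster, zone, project, namespace, and pod names below are placeholders you'd look up for your own Composer environment):

gcloud container clusters get-credentials <composer-gke-cluster> --zone <zone> --project <project>
# find an Airflow worker pod to exec into
kubectl get pods --all-namespaces | grep airflow-worker
kubectl exec -it <airflow-worker-pod> --namespace <composer-namespace> -- /bin/bash
# from inside the worker, the SQL proxy service is reachable
mysql -h $AIRFLOW_SQLPROXY_SERVICE_SERVICE_HOST -u root airflow-db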

Upvotes: 2
