GCPStart

Reputation: 11

How to connect to composer dag folder from GCP Cloud shell

I'm new to GCP. I've been going over various documents on GCP Composer and Cloud Shell, but I can't find anything that explains how to connect the Cloud Shell environment to the Composer DAG folder.

Right now I'm creating the Python script outside Cloud Shell (on my local system) and uploading it to the DAG folder manually, but I want to do all of this from Cloud Shell. Can anyone give me directions?

Also, when I tried `import airflow` in my Python file on Cloud Shell, it gave me a "module not found" error. How do I install that too?

Upvotes: 0

Views: 1324

Answers (1)

Jaime López

Reputation: 466

Take a look at this GCP documentation:

Adding and Updating DAGs (workflows)

Among many other entries, you will find information like this:

Determining the storage bucket name
To determine the name of the storage bucket associated with your environment:

gcloud composer environments describe ENVIRONMENT_NAME \
  --location LOCATION \
  --format="get(config.dagGcsPrefix)"

where:

ENVIRONMENT_NAME is the name of the environment.
LOCATION is the Compute Engine region where the environment is located.
--format is an option to specify only the dagGcsPrefix property instead of all environment details.
The dagGcsPrefix property shows the bucket name:

gs://region-environment_name-random_id-bucket/
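As a concrete sketch of this step, run from Cloud Shell (the environment name `my-composer-env` and region `us-central1` below are placeholders for illustration, not values from the question):

```shell
# Print only the DAG folder's gs:// prefix for a hypothetical environment.
# Substitute your own environment name and region.
gcloud composer environments describe my-composer-env \
  --location us-central1 \
  --format="get(config.dagGcsPrefix)"
```

The single line of output is the `gs://...` path of the environment's `dags` folder, which you can reuse with `gsutil` commands.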

Adding or updating a DAG
To add or update a DAG, move the Python .py file for the DAG to the environment's dags folder in Cloud Storage.

gcloud composer environments storage dags import \
    --environment ENVIRONMENT_NAME \
    --location LOCATION \
    --source LOCAL_FILE_TO_UPLOAD

where:

ENVIRONMENT_NAME is the name of the environment.
LOCATION is the Compute Engine region where the environment is located.
LOCAL_FILE_TO_UPLOAD is the DAG to upload.
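Putting the two steps together from Cloud Shell might look like this (again, `my-composer-env`, `us-central1`, `my_dag.py`, and the bucket path are assumed placeholder values):

```shell
# Upload a DAG file sitting in your Cloud Shell home directory
gcloud composer environments storage dags import \
    --environment my-composer-env \
    --location us-central1 \
    --source my_dag.py

# Equivalent direct copy with gsutil, using the dagGcsPrefix
# obtained from the describe command (placeholder path shown):
gsutil cp my_dag.py gs://us-central1-my-composer-env-1234abcd-bucket/dags/
```

Either form works from Cloud Shell, since `gcloud` and `gsutil` are preinstalled and already authenticated there, which is exactly what removes the manual-upload step the question describes.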

Upvotes: 1
