NoobZik

Reputation: 149

Deploying a Kedro project to Cloud Build with pytest accessing Google Cloud Storage

I am trying to deploy my Kedro project to Cloud Run, and my cloudbuild.yml looks like this:

steps:
- name: "noobzik/uv-gcp-cloud-build"
  id: CI
  entrypoint: /bin/bash
  env:
    - PROJECT_ID=$PROJECT_ID
    - SERVICE_ACCOUNT=$_SERVICE_ACCOUNT
  args:
  - -c
  - |
    echo "$SERVICE_ACCOUNT" | base64 -d > service_account.json
    gcloud auth activate-service-account --key-file=service_account.json
    gcloud config set project "$PROJECT_ID"
    chmod a+x install.sh && 
    ./install.sh &&
    source .venv/bin/activate &&
    pytest .
#- name: "noobzik/uv-gcp-cloud-build"
#  id: CD
#  entrypoint: /bin/bash
#  args:
#  - -c
#  - 'chmod a+x install.sh && ./install.sh && kedro run --pipeline global'
#  env:
#  - 'ENV=$BRANCH_NAME'
#  - 'MLFLOW_SERVER=$_MLFLOW_SERVER'

logs_bucket: gs://purchase_predict

The reason I use my own Docker image is to use Astral uv to speed up the installation of the pip requirements.
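For context, install.sh essentially just sets up a virtual environment and installs the requirements with uv, along these lines (simplified sketch, the real script may differ):

#!/usr/bin/env bash
# Create a virtual environment and install the project requirements with uv,
# which is much faster than plain pip inside Cloud Build.
set -euo pipefail

uv venv .venv
source .venv/bin/activate
uv pip install -r requirements.txt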

The issue is that pytest fails the unit test for one of my nodes:

tests/pipelines/loading/test_pipeline.py F                               [ 40%]

The test basically grabs a Spark-generated CSV folder named primary.csv located at gs://purchase_predict/primary.csv. It fails because my build is not authenticated in Google Cloud Build. So I tried passing the JSON key as a substitution variable (both base64-encoded and plain) into _SERVICE_ACCOUNT, but it's not working:

base64: invalid input
ERROR: (gcloud.auth.activate-service-account) Could not read json file service_account.json: Expecting value: line 1 column 1 (char 0)
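To be explicit, the CI step expects _SERVICE_ACCOUNT to hold the key base64-encoded on a single line, i.e. something produced like this (sketch):

# Encode the key file without line wrapping so it survives being passed
# as a Cloud Build substitution variable
base64 -w 0 service_account.json

# The CI step then reverses this with:
# echo "$SERVICE_ACCOUNT" | base64 -d > service_account.json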

I am starting to run out of solutions. In case it matters, I have already granted my service account the roles it needs to access GCS, roughly as shown below.
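(Sketch; the service account name is illustrative.)

# Give the service account read access to the bucket
gsutil iam ch \
  serviceAccount:my-build-sa@$PROJECT_ID.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://purchase_predict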

Any help is appreciated

Upvotes: 0

Views: 37

Answers (0)
