Gary Turner

Reputation: 367

403 trying to run terraform from Gitlab without json file

After a pile of troubleshooting, I managed to get my GitLab CI/CD pipeline to connect to GCP without requiring my service account to use a JSON key. However, I'm unable to do anything with Terraform in my pipeline using a remote statefile because of the following error:

Error: Failed to get existing workspaces: querying Cloud Storage failed: googleapi: Error 403: Insufficient Permission, insufficientPermissions

My gitlab-ci.yml file is defined as follows:

stages:
  - auth
  - validate

gcp-auth:
  stage: auth
  image: google/cloud-sdk:slim
  script:
    - echo ${CI_JOB_JWT_V2} > .ci_job_jwt_file
    - gcloud iam workload-identity-pools create-cred-config ${GCP_WORKLOAD_IDENTITY_PROVIDER}
      --service-account="${GCP_SERVICE_ACCOUNT}"
      --output-file=.gcp_temp_cred.json
      --credential-source-file=.ci_job_jwt_file
    - gcloud auth login --cred-file=`pwd`/.gcp_temp_cred.json
    - gcloud auth list

tf-stuff:
  stage: validate
  image:
    name: hashicorp/terraform:light
    entrypoint:
      - '/usr/bin/env'
      - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
  before_script:
    - export TF_LOG=DEBUG
    - cd terraform
    - rm -rf .terraform
    - terraform --version
    - terraform init
  script:
    - terraform validate

My gcp-auth job is running successfully from what I can see:

Authenticated with external account credentials for: [[MASKED]].

I've also gone as far as adding a gsutil cp command inside the gcp-auth job to make sure I can access the desired bucket, which I can. I can successfully edit the contents of the bucket where my Terraform statefile is stored.

I'm fairly new to GitLab CI/CD pipelines. Is there something I need to do to tie the gcp-auth job to the tf-stuff job? It's as if that job doesn't know the pipeline was previously authenticated with the service account.

Thanks!

Upvotes: 0

Views: 1147

Answers (2)

Anders Elton

Reputation: 871

As mentioned by other posters, GitLab jobs run independently and don't share environment variables or a filesystem, so to keep the login state between jobs you have to preserve it somehow.

I wrote a blog with a working example: https://ael-computas.medium.com/gcp-workload-identity-federation-on-gitlab-passing-authentication-between-jobs-ffaa2d51be2c

I have done it the way GitHub Actions does it, by storing the temporary credentials as artifacts. By setting the right environment variables you can "keep" the logged-in state (gcloud will implicitly refresh your token) without having to build a base image that contains everything. For this to work, all jobs must run the gcp_auth_before steps (or extend the .gcp-auth job) and have the _auth/ artifacts preserved between jobs.

In the sample below you can see that the login state is preserved across two jobs while only actually signing in on the first one. I have used this together with Terraform images for further steps and it works like a charm so far.

This is very early so there might be hardening required for production.

Hope this example gives you some ideas on how to solve this!

.gcp_auth_before: &gcp_auth_before
  - export GOOGLE_APPLICATION_CREDENTIALS=$CI_PROJECT_DIR/_auth/.gcp_temp_cred.json
  - export CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE=$CI_PROJECT_DIR/_auth/.gcp_temp_cred.json
  - export GOOGLE_GHA_CREDS_PATH=$CI_PROJECT_DIR/_auth/.gcp_temp_cred.json
  - export GOOGLE_CLOUD_PROJECT=$(cat $CI_PROJECT_DIR/_auth/.GOOGLE_CLOUD_PROJECT)
  - export CLOUDSDK_PROJECT=$(cat $CI_PROJECT_DIR/_auth/.GOOGLE_CLOUD_PROJECT)
  - export CLOUDSDK_CORE_PROJECT=$(cat $CI_PROJECT_DIR/_auth/.GOOGLE_CLOUD_PROJECT)
  - export GCP_PROJECT=$(cat $CI_PROJECT_DIR/_auth/.GOOGLE_CLOUD_PROJECT)
  - export GCLOUD_PROJECT=$(cat $CI_PROJECT_DIR/_auth/.GOOGLE_CLOUD_PROJECT)

.gcp-auth:
  artifacts:
    paths:
      - _auth/
  before_script:
    *gcp_auth_before

stages:
  - auth
  - debug

auth:
  stage: auth
  image: "google/cloud-sdk:slim"
  variables:
     SERVICE_ACCOUNT_EMAIL: "... service account email ..."
     WORKLOAD_IDENTITY_PROVIDER: "projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL/providers/PROVIDER"
     GOOGLE_CLOUD_PROJECT: "... project id ...."
  artifacts:
    paths:
      - _auth/
  script:
    - |
      mkdir -p _auth 
      echo "$CI_JOB_JWT_V2" > $CI_PROJECT_DIR/_auth/.ci_job_jwt_file
      echo "$GOOGLE_CLOUD_PROJECT" > $CI_PROJECT_DIR/_auth/.GOOGLE_CLOUD_PROJECT
      gcloud iam workload-identity-pools create-cred-config \
      $WORKLOAD_IDENTITY_PROVIDER \
      --service-account=$SERVICE_ACCOUNT_EMAIL \
      --service-account-token-lifetime-seconds=600 \
      --output-file=$CI_PROJECT_DIR/_auth/.gcp_temp_cred.json \
      --credential-source-file=$CI_PROJECT_DIR/_auth/.ci_job_jwt_file
      gcloud config set project $GOOGLE_CLOUD_PROJECT
    - "export GOOGLE_APPLICATION_CREDENTIALS=$CI_PROJECT_DIR/_auth/.gcp_temp_cred.json"
    - "gcloud auth login --cred-file=$GOOGLE_APPLICATION_CREDENTIALS"
    - gcloud auth list # DEBUG!!

debug:
  extends: .gcp-auth
  stage: debug
  image: "google/cloud-sdk:slim"
  script:
    - env
    - gcloud auth list
    - gcloud storage ls
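
For example, a follow-on Terraform job can reuse the same pattern by extending .gcp-auth. The job below is only a sketch (the job name, stage and commands are assumptions, not part of the tested example above); because it extends .gcp-auth, the before_script exports point GOOGLE_APPLICATION_CREDENTIALS at the credential file restored from the _auth/ artifacts, which the Terraform GCS backend picks up automatically:

terraform-validate:
  extends: .gcp-auth
  stage: debug
  image:
    name: hashicorp/terraform:light
    entrypoint:
      - '/usr/bin/env'
      - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
  script:
    - cd terraform
    - terraform init
    - terraform validate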

Upvotes: 2

Mazlum Tosun

Reputation: 6572

Your two GitLab jobs run on separate pods when using the Kubernetes runner.

The tf-stuff job doesn't see the authentication done in the gcp-auth job.

To solve this issue, you can put the authentication logic in a separate shell script and then reuse that script in both GitLab jobs, for example:

Authentication shell script gcp_authentication.sh:

echo ${CI_JOB_JWT_V2} > .ci_job_jwt_file
gcloud iam workload-identity-pools create-cred-config ${GCP_WORKLOAD_IDENTITY_PROVIDER} \
      --service-account="${GCP_SERVICE_ACCOUNT}" \
      --output-file=.gcp_temp_cred.json \
      --credential-source-file=.ci_job_jwt_file
gcloud auth login --cred-file=`pwd`/.gcp_temp_cred.json
gcloud auth list

# Check if you need to set GOOGLE_APPLICATION_CREDENTIALS env var on `pwd`/.gcp_temp_cred.json
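# e.g. (if needed): export GOOGLE_APPLICATION_CREDENTIALS=`pwd`/.gcp_temp_cred.json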

For the tf-stuff job, you can create a custom Docker image containing both gcloud and Terraform, because the hashicorp/terraform image doesn't contain the gcloud CLI natively.

Your Docker image can be pushed to the GitLab container registry.
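
For example, such an image could be built roughly like this (a sketch only; the base image choice and the Terraform version are assumptions, not requirements):

FROM google/cloud-sdk:slim

# Add a Terraform binary on top of the gcloud base image
ARG TERRAFORM_VERSION=1.3.7
RUN apt-get update \
    && apt-get install -y --no-install-recommends wget unzip \
    && wget -q "https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip" \
    && unzip "terraform_${TERRAFORM_VERSION}_linux_amd64.zip" -d /usr/local/bin \
    && rm "terraform_${TERRAFORM_VERSION}_linux_amd64.zip" \
    && apt-get clean && rm -rf /var/lib/apt/lists/*

Once built, push it to your project's GitLab container registry and reference it in the image field of the tf-stuff job below.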

Your GitLab YAML file:

stages:
  - auth
  - validate

gcp-auth:
  stage: auth
  image: google/cloud-sdk:slim
  script:
    - . ./gcp_authentication.sh

tf-stuff:
  stage: validate
  image:
    name: yourgitlabregistry/your-custom-image:1.0.0
    entrypoint:
      - '/usr/bin/env'
      - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
  before_script:
    - . ./gcp_authentication.sh
    - export TF_LOG=DEBUG
    - cd terraform
    - rm -rf .terraform
    - terraform --version
    - terraform init
  script:
    - terraform validate

Some explanations:

  • The same shell script, gcp_authentication.sh, is used in the two GitLab jobs
  • A custom Docker image containing Terraform and the gcloud CLI is used for the job handling the Terraform part; this image can be pushed to the GitLab registry
  • In the authentication shell script, check whether you need to set the GOOGLE_APPLICATION_CREDENTIALS env var to `pwd`/.gcp_temp_cred.json

You have to give your service account the permission it needs to use GitLab with Workload Identity Federation:

  • roles/iam.workloadIdentityUser
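
This binding can be granted with a command along these lines (PROJECT_NUMBER and POOL_ID are placeholders, and you will typically want to restrict the member to a specific attribute of your pool rather than the whole pool):

gcloud iam service-accounts add-iam-policy-binding "${GCP_SERVICE_ACCOUNT}" \
    --role="roles/iam.workloadIdentityUser" \
    --member="principalSet://iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/*"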

You can check this example project and the documentation.

Upvotes: 0
