Vladimir Pimtchenkov

Reputation: 41

Workload identity federation with @google-cloud

Does anybody know whether there are other ways to authenticate/authorize access to Google Cloud Storage besides a service account key when using the @google-cloud/storage Node.js module? I have read about "Workload Identity Federation", but it seems I cannot use that approach with the @google-cloud/storage library. I was not able to find a suitable constructor, only these two:

const {Storage} = require('@google-cloud/storage');
var storage = new Storage({
  projectId   : `my_google_project_id`,
  keyFilename : `my_google_key_file.json`   // service account key is inside of this file
});
// or this one:
var storage = new Storage();    // service account key is inside of file specified by environment variable GOOGLE_APPLICATION_CREDENTIALS

Any recommendations? Thank you

Upvotes: 4

Views: 2296

Answers (2)

Jose A

Reputation: 11129

There isn't a dedicated connector; the library picks up credentials automatically through the gcloud CLI (Application Default Credentials).

Google discourages the usage of service account keys on Google Workspace accounts for Google Cloud.

Here's how you connect depending on the environment

Locally:

  • You use the Google CLI and Service Account Impersonation

In Google Cloud:

  • You attach a service account with the correct permissions to the resource. Each service then exposes those credentials to your code automatically (via the metadata server), so no key file is needed.

In GitHub or others:

  • You enable OIDC and connect through Service Account Impersonation.

Note that you don't need to modify your code (see example below); ADC will pick up the credentials automatically. One caveat: you will still need a service account to sign URLs when reading from the bucket.

Here's how you can configure it locally (Service Account Impersonation):

I'll be cross-posting from this answer, since others will likely stumble upon the same issue.

Local Development:

1st. Create the service account:

gcloud iam service-accounts create "local-dev-account" \
    --description="Local Development Account" \
    --project="${YOUR_PROJECT_ID}" \
    --display-name="Local Development Account"

2nd. Create the local_dev_role that will be attached to the service account:

gcloud iam roles create "local_dev_role" \
 --project="${YOUR_PROJECT_ID}" \
 --file="./roles-local.gcp.yml"

Add as many permissions as you need (these are independent of the email account connected through the CLI). roles-local.gcp.yml:

# gcloud iam roles update local_dev_role --project=alertdown-staging --file=./roles-local.gcp.yml
# https://stackoverflow.com/a/68901952/1057052
# We are currently using Workload Identity Federation to authenticate. This means that there are no service keys passed
# around.
# Therefore, we need another mechanism to authenticate when developing locally.
# In Production, services are authenticated via IAM directly.
# This file contains only the minimum set of permissions required to access the development environment.
# We will generate a local JSON file that will serve as an impersonation to a service account.
# Required roles:

title: Local Development Roles
description: |
  This policy is for local development as Google discourages the usage of service account keys. This will work for impersonation
stage: GA
# https://cloud.google.com/iam/docs/permissions-reference
includedPermissions:
  # Permissions for GCR
  - storage.objects.create
  - storage.objects.delete # Optional: only include if you need to delete images
  - storage.objects.get
  - storage.objects.list
  - storage.objects.update
  - storage.objects.getIamPolicy
  - storage.objects.setIamPolicy
  # Add these permissions for token management
  - iam.serviceAccounts.getAccessToken
  - iam.serviceAccounts.signBlob # Required for signed URLs
  # Add more permissions as needed

3rd. Attach the service account at the project level

gcloud projects add-iam-policy-binding ${YOUR_PROJECT_ID} \
 --member="serviceAccount:local-dev-account@${YOUR_PROJECT_ID}.iam.gserviceaccount.com" \
 --role="projects/${YOUR_PROJECT_ID}/roles/local_dev_role" \
 --project="${YOUR_PROJECT_ID}"

Note: there's a similar command, gcloud iam service-accounts add-iam-policy-binding .... Don't use it here; it will not work, because it grants a role on the service account resource itself rather than granting the service account a role on the project.

4th. Impersonate the service account:

gcloud auth application-default login --impersonate-service-account=local-dev-account@${YOUR_PROJECT_ID}.iam.gserviceaccount.com

If that didn't work and you're getting errors about the account not being impersonated:

Attach the roles/iam.serviceAccountTokenCreator to YOUR email address:

gcloud iam service-accounts add-iam-policy-binding \
    local-dev-account@${YOUR_PROJECT_ID}.iam.gserviceaccount.com \
    --member="user:[email protected]" \
    --role="roles/iam.serviceAccountTokenCreator" \
    --project="${YOUR_PROJECT_ID}"

gcloud iam service-accounts add-iam-policy-binding \
    local-dev-account@${YOUR_PROJECT_ID}.iam.gserviceaccount.com \
    --member="user:[email protected]" \
    --role="roles/iam.serviceAccountUser" \
    --project="${YOUR_PROJECT_ID}"

And just for the sake of completeness: Here's the full Storage class wrapper I've created in Node:

import {
  InternalServerException,
  PromiseExceptionResult,
} from '@alertdown/core';
import { Storage as GoogleCloudStorage } from '@google-cloud/storage';
import { Err, Ok } from 'oxide.ts';

const getFileUrl = (bucket: string, filename: string) => {
  return `https://storage.googleapis.com/${bucket}/${filename}`;
};

type ConstructorInput = {
  bucket: string;
  /**
   * Development only
   */
  projectId?: string;
};

export class Storage {
  #client: GoogleCloudStorage;
  #bucket: string;

  constructor(input: ConstructorInput) {
    this.#client = new GoogleCloudStorage({
      projectId: input.projectId,
    });
    this.#bucket = input.bucket;
  }

  async upload(file: Buffer, filename: string): PromiseExceptionResult<string> {
    try {
      const bucket = this.#client.bucket(this.#bucket);
      const blob = bucket.file(filename);

      await blob.save(file, {
        metadata: {
          contentType: 'image/png',
        },
      });
      return Ok(getFileUrl(this.#bucket, filename));
    } catch (error) {
      return Err(new InternalServerException(error as Error));
    }
  }

  async resolveUrl(url: string): PromiseExceptionResult<string> {
    try {
      const parsedUrl = new URL(url);
      const filename = parsedUrl.pathname.split('/').pop();
      // Debug only: inspect which credentials ADC resolved (avoid logging credentials in production).
      console.log((await this.#client.authClient.getClient()).credentials);

      if (!filename) {
        return Err(new InternalServerException('Invalid URL format'));
      }

      const bucket = this.#client.bucket(this.#bucket);
      const file = bucket.file(filename);
      const [signedUrl] = await file.getSignedUrl({
        version: 'v4',
        action: 'read',
        expires: Date.now() + 15 * 60 * 1000, // 15 minutes
      });
      return Ok(signedUrl);
    } catch (error) {
      console.error(`Error resolving URL: ${url}`, error);
      return Err(new InternalServerException(error as Error));
    }
  }
}

As you can see, I only passed the projectId (YOUR_PROJECT_ID) and bucket name.
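The URL round-trip in the wrapper (build a URL on upload, extract the object name again in resolveUrl) can be isolated as plain functions. This is the same logic minus the GCS calls, shown as a minimal sketch:

```javascript
// Same URL-building logic as the wrapper's getFileUrl, no GCS dependency.
function getFileUrl(bucket, filename) {
  return `https://storage.googleapis.com/${bucket}/${filename}`;
}

// Mirrors resolveUrl's parsing step: pull the object name back out of the URL.
function extractFilename(url) {
  const parsed = new URL(url);
  return parsed.pathname.split('/').pop() || null;
}
```

Note one limitation this sketch shares with the wrapper: object names containing `/` (e.g. `avatars/user1.png`) would be truncated to the last path segment, so it only round-trips cleanly for flat object names.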

Additional Notes:

  • List of Permissions - Google Cloud
  • View Roles Permissions (https://console.cloud.google.com/iam-admin/roles) (you can view the individual role permissions here, then add them to your YAML file as needed)
  • Every time you update a permission, if it doesn't work for any reason, try reauthenticating.

Helpful Commands For Debugging:

  1. Check if the role exists and list its permissions:
gcloud iam roles describe local_dev_role \
    --project=${YOUR_PROJECT_ID}
  2. Verify that the role is attached to the service account:
gcloud projects get-iam-policy ${YOUR_PROJECT_ID} \
    --filter="bindings.members:local-dev-account@${YOUR_PROJECT_ID}.iam.gserviceaccount.com" \
    --format="table(bindings.role)"
  3. Update the role from the YAML file (roles-local.gcp.yml):
gcloud iam roles update local_dev_role \
    --project=${YOUR_PROJECT_ID} \
    --file=./roles-local.gcp.yml

Upvotes: 0

John Hanley

Reputation: 81454

Most Google client libraries support a credentials file with the type external_account. The following demonstrates how to create this file and set up Application Default Credentials (ADC) to load it.

To use Workload Identity Federation with Google Client libraries, save the federated credentials to a file and then specify that file via the environment variable GOOGLE_APPLICATION_CREDENTIALS. The Storage client will use ADC and locate the credentials from the environment.

Example for AWS:

# Generate an AWS configuration file.
gcloud iam workload-identity-pools create-cred-config \
    projects/$PROJECT_NUMBER/locations/global/workloadIdentityPools/$POOL_ID/providers/$AWS_PROVIDER_ID \
    --service-account $SERVICE_ACCOUNT_EMAIL \
    --aws \
    --output-file /path/to/generated/config.json

Example for Azure:

# Generate an Azure configuration file.
gcloud iam workload-identity-pools create-cred-config \
    projects/$PROJECT_NUMBER/locations/global/workloadIdentityPools/$POOL_ID/providers/$AZURE_PROVIDER_ID \
    --service-account $SERVICE_ACCOUNT_EMAIL \
    --azure \
    --output-file /path/to/generated/config.json

Note: I generated my credentials on an Azure VM. I added the following command line option to the above command:

--app-id-uri=https://iam.googleapis.com/projects/REDACTED/locations/global/workloadIdentityPools/pool-azure/providers/provider-id

The output-file value is used to set the environment variable (on Linux/macOS; use `set` instead of `export` on Windows):

export GOOGLE_APPLICATION_CREDENTIALS=/path/to/generated/config.json

The file has the following structure. This example is for Azure:

{
  "type": "external_account",
  "audience": "//iam.googleapis.com/projects/REDACTED/locations/global/workloadIdentityPools/pool-azure/providers/provider-id",
  "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
  "token_url": "https://sts.googleapis.com/v1/token",
  "credential_source": {
    "url": "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://iam.googleapis.com/projects/REDACTED/locations/global/workloadIdentityPools/pool-azure/providers/provider-id",
    "headers": {
      "Metadata": "True"
    },
    "format": {
      "type": "json",
      "subject_token_field_name": "access_token"
    }
  },
  "service_account_impersonation_url": "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/[email protected]:generateAccessToken"
}

Use this style to create a client:

var storage = new Storage();

Upvotes: 1
