Reputation: 84
I believe I have hit a dead end, but I would like some validation from an expert.
I started learning GCP this week, and I am trying to achieve something that I was previously able to achieve with AWS. Please bear with me, as I know very little about GCP.
Problem:
On a local machine, configure dynamically fetched, short-lived credentials for authenticating with public cloud APIs (static keys are out of the question),
for when an authenticated human user (GCP: IAM user, AWS: IAM user) is impersonating a non-human identity (GCP: IAM service account, AWS: IAM role),
in such a way that the client SDKs offered by the cloud platform use those credentials as the default (impersonated, non-human) identity when they make calls to the platform's APIs.
Let us say that, on my local machine, I have an authenticated User A who needs to make calls to the public cloud APIs on behalf of ServiceAccount/Role B.
Platform:
AWS - Solved
GCP - Unsolved
AWS Solution
In AWS, the problem statement is: from my local machine, where I am authenticated as User-A, I want my local AWS client SDK to make calls to AWS APIs as IAM-ROLE-B.
The solution is the following:
1 - Authorize User-A to perform sts:AssumeRole on IAM-ROLE-B via an IAM policy attached to User-A
2 - Allow User-A to assume IAM-ROLE-B via the assume-role (trust) policy on IAM-ROLE-B itself
3 - Use, for example, the AWS CLI to make an AssumeRole call to the AWS STS API. This returns: a. an Access Key ID, b. a Secret Access Key, and c. a Session Token
4 - Set these in the ~/.aws/credentials file, and all the SDKs default to these temporary credentials
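Steps 3 and 4 above can be sketched as follows. The role ARN and credential values are placeholders, the live aws CLI call is shown as a comment since it needs a real account, and the file is written to a temp path to keep the sketch side-effect free:

```shell
# Step 3: ask STS for temporary credentials for IAM-ROLE-B
# (requires the aws CLI and valid User-A credentials):
#
#   aws sts assume-role \
#       --role-arn arn:aws:iam::123456789012:role/IAM-ROLE-B \
#       --role-session-name local-session
#
# Step 4: write the three returned values into ~/.aws/credentials so
# every SDK picks them up by default. Placeholder values below.
CREDS_FILE="$(mktemp)"
cat > "$CREDS_FILE" <<'EOF'
[default]
aws_access_key_id     = ASIAXXXXXXXXXXXXXXXX
aws_secret_access_key = SECRETKEYPLACEHOLDER
aws_session_token     = SESSIONTOKENPLACEHOLDER
EOF
cat "$CREDS_FILE"
```

Note that temporary STS credentials only work if the session token is set alongside the key pair.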
Barrier in GCP
The problem occurs when I try to reproduce the same behavior in GCP.
From my local machine, where I am authenticated as User-A, I want my local GCP client SDK to make calls to GCP APIs as SERVICE-ACCOUNT-B.
Since I need dynamic, short-lived credentials, I cannot use a service account key file, which is static and insecure; there are well-known reasons to avoid it.
The only other option I am aware of is to "impersonate" a service account (make calls on behalf of SERVICE-ACCOUNT-B) via projects.serviceAccounts.generateAccessToken. In my AWS analogy, step 3 returns an access key ID and secret access key that I can set globally (to be used by the client SDKs).
In the case of GCP, the projects.serviceAccounts.generateAccessToken call gives me an access token, which is essentially the short-lived credential for making calls on behalf of SERVICE-ACCOUNT-B.
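For illustration, such a short-lived token can also be minted from the CLI via gcloud's impersonation flag and injected into a single request as a bearer token. This is a sketch: the service account email and project are placeholders, and the live commands are commented out because they need a real project:

```shell
# Mint a short-lived access token for SERVICE-ACCOUNT-B (the caller needs
# the iam.serviceAccounts.getAccessToken permission on that account):
#
#   TOKEN="$(gcloud auth print-access-token \
#       --impersonate-service-account=sa-b@my-project.iam.gserviceaccount.com)"
#
# Inject it into an individual request, e.g. listing Cloud Storage
# buckets via the JSON API:
#
#   curl -H "Authorization: Bearer $TOKEN" \
#       "https://storage.googleapis.com/storage/v1/b?project=my-project"
#
# Placeholder token so this sketch runs without a GCP account:
TOKEN="ya29.placeholder"
AUTH_HEADER="Authorization: Bearer $TOKEN"
echo "$AUTH_HEADER"
```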
Although I can inject this access token into individual requests I make to GCP, is there any way to set it globally so that it is picked up automatically by the GCP client SDKs, without my having to modify my code base to inject the token manually?
Recalling that I am a total GCP newbie: am I even thinking in the right direction? Or have I misunderstood GCP auth concepts and approached the problem the wrong way?
Any input and ideas to solve the problem (without static keys) are highly appreciated.
Upvotes: 0
Views: 1251
Reputation: 158
No, the approach you are looking for is not possible in GCP the way it is in AWS, whether you use gcloud, raw API requests, or the client libraries. As you already mentioned, the best approach is to impersonate service accounts.
Keep in mind that every time you impersonate a service account, you must do so explicitly, since you cannot store the credentials the way you do in AWS. For example:
gcloud compute instances create sample-vm --zone=us-central1-a \
    --machine-type=n1-standard-1 \
    --image=centos-8-v20200811 --image-project=centos-cloud \
    --impersonate-service-account=impersonated@project-id.iam.gserviceaccount.com
or when using the client libraries:
from google.cloud import storage
from google.auth import impersonated_credentials
import google.auth

# Scopes that the impersonated credentials will be limited to.
target_scopes = ['https://www.googleapis.com/auth/devstorage.read_only']

# Source credentials: whatever identity is currently authenticated locally.
source_credentials, project = google.auth.default()

# Short-lived credentials for the target service account (60-second lifetime).
# Named target_credentials to avoid shadowing the impersonated_credentials module.
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal='[email protected]',
    target_scopes=target_scopes,
    lifetime=60)

client = storage.Client(credentials=target_credentials)
blobs = client.list_blobs('bucket_name')
for blob in blobs:
    print(blob.name)
And finally, when calling the REST APIs directly, you should always include the token in the request, which is what you are doing right now.
Upvotes: 2