Reputation: 271
We've been running daily automated full database exports from Google Cloud SQL to Google Cloud Storage (across projects), using a Cloud Function to trigger the export. (See this article.)
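For context, the function effectively triggers the equivalent of the following export call; the instance, database, and bucket names below are placeholders:
# Rough equivalent of what the scheduled Cloud Function triggers each day
# (placeholder names; the real workflow calls the Cloud SQL Admin API)
gcloud sql export sql my-instance \
    gs://my-export-bucket/backups/export-$(date +%F).sql \
    --database=my-database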
Starting 4 days ago, we've been getting a 403 error, with the logs showing: "The service account does not have the required permissions for the bucket."
We haven't made any permissions changes and running:
gsutil acl get [BUCKET]
still shows that our Google Cloud service account has the "WRITER" role, as we'd expect.
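Since bucket ACLs and bucket-level IAM bindings are evaluated separately, the IAM policy can be dumped as well to rule that side out:
# Show the bucket's IAM policy (ACLs and IAM are separate systems)
gsutil iam get gs://[BUCKET]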
From our perspective, it seems like we've randomly lost permissions for this workflow. Does anyone have any suggestions on how to debug this and get this workflow working again?
Edit: This seems to be a recently introduced bug: https://issuetracker.google.com/issues/166478544
Upvotes: 0
Views: 1396
Reputation: 951
Or just use:
# Look up the service account created by default for the Cloud SQL instance
CLOUD_SQL_SERVICE_ACCOUNT=$(gcloud sql instances describe "$INSTANCE_ID" \
    --format="value(serviceAccountEmailAddress)")

# Grant that service account object access on the export bucket
gsutil iam ch "serviceAccount:${CLOUD_SQL_SERVICE_ACCOUNT}:roles/storage.objectUser" \
    "gs://$BUCKET_NAME"
Upvotes: 0
Reputation: 4670
Yes, you are correct: this does appear to be a bug affecting your situation. Regarding your question in the public issue you linked, it is indeed related to permissions at the bucket level. Since the bug affects you at the bucket level, it makes more sense to grant permissions at that level than to modify the roles in your project-wide IAM policy. To modify permissions at the bucket level, you can follow the instructions indicated here.
Or, just follow the steps below (a CLI equivalent is sketched at the end of this answer):
1. In the Cloud Console, open your bucket and go to its Permissions tab.
2. Add the Cloud SQL service account as a member and grant it a role that includes the storage.buckets.get permission; the roles that contain this permission are available here.
3. Click Add once you are done.
Anyway, I would recommend following the workaround and waiting for updates in the public issue until it's fixed. However, keep in mind that, as the official documentation indicates, bucket-level access is actually recommended.
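For reference, a bucket-level grant like the one in the steps above can also be done from the CLI; roles/storage.legacyBucketReader is one predefined role that includes storage.buckets.get (the variable names below are placeholders, and the role choice is an assumption, not the only option):
# Grant a role containing storage.buckets.get on the bucket itself,
# on top of the existing WRITER access (placeholder variable names)
gsutil iam ch "serviceAccount:${CLOUD_SQL_SERVICE_ACCOUNT}:roles/storage.legacyBucketReader" \
    "gs://$BUCKET_NAME"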
Upvotes: -1