Reputation:
blob.upload_from_filename(source)
gives the error
raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/bucket1-newsdata-bluetechsoft/o?uploadType=multipart: ('Request failed with status code', 403, 'Expected one of', )
I am following the Google Cloud Storage example written in Python here:
from google.cloud import storage

def upload_blob(bucket, source, des):
    client = storage.Client.from_service_account_json('/path')
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket)
    blob = bucket.blob(des)
    blob.upload_from_filename(source)
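Note that the snippet above builds a client from the service-account JSON but then uploads through a second, default client. A minimal sketch that routes the upload through the service-account client instead ('/path' is the placeholder from the question):

from google.cloud import storage

def upload_blob(bucket_name, source, des):
    # Build the client from the service-account key and use it for the upload
    client = storage.Client.from_service_account_json('/path')
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(des)
    blob.upload_from_filename(source)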
I used gsutil to upload files, which works fine.
I also tried listing the bucket names with a Python script, which works fine as well.
I have the necessary permissions and GOOGLE_APPLICATION_CREDENTIALS set.
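For reference, the bucket-listing check that works is along these lines (a sketch; the names printed depend on the project):

from google.cloud import storage

# Listing buckets succeeds with the same credentials
client = storage.Client()
for bucket in client.list_buckets():
    print(bucket.name)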
Upvotes: 4
Views: 16344
Reputation: 1381
For me, the problem was this: when initializing the app, the value of storageBucket must NOT start like this: gs://myapp.appspot.com. Remove the gs:// prefix.
Example:
firebase_admin.initialize_app(cred, {
    'storageBucket': 'myapp.appspot.com'
})
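With the bucket name configured that way, uploads can go through the Firebase Admin SDK's storage helper. A minimal sketch, assuming initialize_app was called as above (the file names are placeholders):

from firebase_admin import storage

# Returns the default bucket configured in initialize_app; it is a
# google.cloud.storage Bucket, so the usual blob API applies
bucket = storage.bucket()
blob = bucket.blob('folder/file.txt')
blob.upload_from_filename('local_file.txt')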
Upvotes: 1
Reputation: 185
For me, the issue was that I had used different service accounts for the Eventarc trigger and the runtime when creating the cloud function. I didn't realize they could be the same, so changing permissions on only one of them of course did not work.
Go to "cloud functions" -> click the name of the function -> "edit" -> set the service account under the "Eventarc trigger" -> "service account" to the same email address as for the service account in "Runtime, build, connections and security settings" -> "Runtime" -> "runtime service account".
Upvotes: 0
Reputation: 615
As other answers have indicated, this is a permissions issue. I have found the following command to be a useful way to create Application Default Credentials for the currently logged-in user.
Assuming you got this error while running the code on some machine, the following step is sufficient:
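Assuming the gcloud CLI is installed, the command that creates Application Default Credentials for the currently logged-in user is:

gcloud auth application-default login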
That's it. Any Python application started as that user will use these as the default credentials when interacting with storage buckets.
Happy GCP'ing :)
Upvotes: 2
Reputation: 789
This is what worked for me when the Google documentation didn't. I was getting the same error despite having the appropriate permissions.
import pathlib
import google.cloud.storage as gcs

client = gcs.Client()

# set target file to write to
target = pathlib.Path("local_file.txt")
# set file to download
FULL_FILE_PATH = "gs://bucket_name/folder_name/file_name.txt"

# open filestream with write permissions
with target.open(mode="wb") as downloaded_file:
    # download and write file locally
    client.download_blob_to_file(FULL_FILE_PATH, downloaded_file)
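Note that client.download_blob_to_file accepts either a Blob object or a full gs:// URI string, which is why the path above can be passed in directly.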
Upvotes: 0
Reputation:
This whole thing wasn't working because I didn't have the Storage Admin role on the service account I was using in GCP.
Granting Storage Admin to my service account solved the problem.
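For reference, a grant of that kind can also be scripted. A sketch, with PROJECT_ID and the service-account email as placeholders:

# Grant the Storage Admin role to the service account at the project level
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:my-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.admin"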
Upvotes: 9
Reputation: 92
This question is more appropriate for a support case.
As you are getting a 403, most likely you are missing a permission in IAM; the Google Cloud Platform support team will be able to inspect your resources and configuration.
Upvotes: 0