Reputation: 17
I have implemented the following code, but whenever I call the Cloud Functions HTTP trigger I get "Error: Could not handle the request". I checked the logs and found the error shown below; just above it is the code I'm running to get a key. I want to retrieve a key that I can then test by using it to connect to a storage bucket - is that even possible? The idea is to give someone a key so they can connect to the bucket and upload some data. I'm trying to replicate the SAS (shared access signature) that Azure uses to grant access to a storage account.
import os

import googleapiclient.discovery
from google.oauth2 import service_account

service_account_email = '[email protected]'

# Credentials come from the key file named in GOOGLE_APPLICATION_CREDENTIALS
credentials = service_account.Credentials.from_service_account_file(
    filename=os.environ['GOOGLE_APPLICATION_CREDENTIALS'],
    scopes=['https://www.googleapis.com/auth/cloud-platform'])

# Build the IAM client and create a new key for the service account
service = googleapiclient.discovery.build('iam', 'v1', credentials=credentials)
key = service.projects().serviceAccounts().keys().create(
    name='projects/-/serviceAccounts/' + service_account_email, body={}
).execute()
new request 6dox145ag5e0
Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v2.py", line 402, in run_http_function
    result = _function_handler.invoke_user_function(flask.request)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v2.py", line 268, in invoke_user_function
    return call_user_function(request_or_event)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v2.py", line 261, in call_user_function
    return self._user_function(request_or_event)
  File "/user_code/main.py", line 33, in hello_world
    filename=os.environ['GOOGLE_APPLICATION_CREDENTIALS'],
  File "/env/lib/python3.7/os.py", line 681, in __getitem__
    raise KeyError(key) from None
KeyError: 'GOOGLE_APPLICATION_CREDENTIALS'
Upvotes: 0
Views: 152
Reputation: 75800
You can do this. But you mustn't do this!
Several points:

@appspot.gserviceaccount.com is the App Engine default service account and has, by default, the Editor role. That is a lot of permissions: it can do almost anything on the project. If you give its key to someone on the internet, they can break your project, or even create a lot of VMs with Bitcoin miners on them. And you will pay!

The correct pattern is to hand out a signed URL instead: a time-limited URL that grants access to a single object in the bucket, without sharing any credentials.
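To make that concrete, here is a minimal sketch of signed URL generation with the google-cloud-storage client library. The function name, bucket name, object name, and expiration are illustrative assumptions, not values from the question:

from datetime import timedelta

from google.cloud import storage

def generate_upload_url(bucket_name, blob_name):
    # Return a V4 signed URL allowing a single PUT upload, valid 15 minutes
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),  # short-lived, like an Azure SAS
        method="PUT",
    )

# Hypothetical usage: the caller PUTs the file to this URL, no credentials needed
url = generate_upload_url("my-bucket", "incoming/data.csv")

Note that signing needs a private key; on Cloud Functions, where the runtime credentials carry none, generate_signed_url also accepts service_account_email and access_token arguments to sign through the IAM API instead.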
Update 1:
To achieve what you want, you need 2 functions:
The first one provides the signed URL. Deploy the function in private mode with --no-allow-unauthenticated (or remove allUsers from the cloudfunctions.invoker role). Then, when you test it, you can do this: curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https:... If you use an external tool, generate a token with the command gcloud auth print-access-token, copy it, and add it to the header of your requests. The token is valid for 1 hour.
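A hedged sketch of what that first function could look like as an HTTP Cloud Function; the entry-point name, query parameter, and bucket name are assumptions for illustration:

from datetime import timedelta

from google.cloud import storage

def get_upload_url(request):
    # HTTP Cloud Function, deployed with --no-allow-unauthenticated.
    # Returns a short-lived signed URL the caller can PUT a file to.
    blob_name = request.args.get("name", "upload.bin")  # hypothetical parameter
    blob = storage.Client().bucket("my-bucket").blob(blob_name)  # placeholder bucket
    url = blob.generate_signed_url(
        version="v4", expiration=timedelta(minutes=15), method="PUT"
    )
    return {"url": url}  # the Flask-based runtime serializes the dict as JSON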
The second function is triggered by a Cloud Storage event. When a file is uploaded (finalized), the function is invoked. Catch the event, get the file metadata from it (bucket name + file path), and store it in a database.
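A minimal sketch of that second function, assuming a google.storage.object.finalize trigger; the function name is illustrative, the bucket and name fields come from the Cloud Storage event payload, and the storage step is left as a placeholder:

def on_file_uploaded(event, context):
    # Background Cloud Function invoked when an object is finalized
    bucket_name = event["bucket"]  # bucket that received the upload
    file_path = event["name"]      # object path within the bucket
    print(f"New upload: gs://{bucket_name}/{file_path}")
    # Store (bucket_name, file_path) in the database of your choice here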
Because I don't know what you want to do with your file metadata, I can't recommend a specific database.
Upvotes: 1