Reputation: 1476
My Cloud Function, CF (using service account credentials, [email protected]), is unable to fetch BQ query results when the query is initiated by an app running on App Engine (using the App Engine default service account credentials, [email protected]):

```
File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 293, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 GET https://www.googleapis.com/bigquery/v2/projects/[PROJECT]/queries/[BQ-JOB-ID]?maxResults=0: Access Denied: Dataset [TEMP-DATASET-ID-STORING-QUERY-RESULTS]: The user [SERVICE-ACCOUNT-EMAIL] does not have permission to access results of another user's job.
```
The whole flow has two parts (everything using the Python 3.x client libraries):

PART A: An app running on the App Engine flexible environment (Python 3.x) initiates a BQ query via job.insert. The request ends by publishing the job_id to a Pub/Sub topic.

PART B: A Cloud Function (python37 runtime, triggered by this Pub/Sub event) parses the job_id out of the PubsubMessage, polls the job status (job.get) until job.done, and then fetches the query results.
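The Pub/Sub hand-off in PART B could be sketched roughly as below. This is my own illustration, not the asker's code: it assumes PART A publishes the job_id as a JSON payload, and the function name and message shape are hypothetical.

```python
import base64
import json

def extract_job_id(event):
    """Pull the BigQuery job_id out of a Pub/Sub-triggered event.

    Assumes the publishing side sent a JSON body like
    {"job_id": "..."} -- adjust to match the real message shape.
    """
    # Cloud Functions delivers the PubsubMessage data base64-encoded.
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    return payload["job_id"]

# Example of the event shape a Pub/Sub trigger delivers:
fake_event = {
    "data": base64.b64encode(
        json.dumps({"job_id": "job_123"}).encode("utf-8")
    ).decode("utf-8")
}
print(extract_job_id(fake_event))  # job_123
```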
It is at the last step, fetching the query results, that I get the above-mentioned error (obtained from Stackdriver Logging).
I acknowledge that the
[email protected]
[email protected]
accounts are different, but both service accounts have Project Editor level permissions, so I expected the CF to be able to access the query job results. What's more, the CF (using its service account credentials) is able to poll the job status (job.get) until it's DONE; it's only the retrieval of the query results that throws the error.
Any guidance will be highly appreciated!
Upvotes: 1
Views: 2285
Reputation: 4384
If you're running the job and consuming the results with different identities, persist the results to a named destination table. You can set up a designated dataset with a short default table TTL so tables are automatically removed after that time. Cached/anonymous results are restricted to the query creator by default.
An example of constructing a query with a destination table can be found in the BigQuery docs.
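At the REST level, the two pieces involved look roughly like this. The project, dataset, and table ids below are illustrative placeholders of mine, not the asker's actual layout:

```python
# 1) A dedicated results dataset with a default table TTL, so
#    result tables clean themselves up (datasets.insert body):
dataset_body = {
    "datasetReference": {"projectId": "my-project", "datasetId": "query_results"},
    # Tables expire 24h after creation (value is in milliseconds).
    "defaultTableExpirationMs": str(24 * 60 * 60 * 1000),
}

# 2) A query job that writes to a named table in that dataset
#    instead of an anonymous cached table (jobs.insert body,
#    configuration.query.destinationTable is the key part):
job_body = {
    "configuration": {
        "query": {
            "query": "SELECT 1 AS x",
            "useLegacySql": False,
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "query_results",
                "tableId": "job_123_results",
            },
            "writeDisposition": "WRITE_TRUNCATE",
        }
    }
}
```

Because the results then live in a regular table, the Cloud Function's service account only needs read access on that dataset to fetch them, regardless of which identity ran the job.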
Upvotes: 3