Reputation: 71
I've found instructions on how to generate credentials at the project level, but there aren't clear instructions on granting a service account access to only a specific dataset using the CLI.
I tried creating the service account:
gcloud iam service-accounts create NAME
and then getting the dataset:
bq show \
--format=prettyjson \
project_id:dataset > path_to_file
and then adding an entry to the access section:
{
"role": "OWNER",
"userByEmail": "[email protected]"
},
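For reference, the edited file might look something like this after adding the entry (a sketch only; the project, dataset, and service-account names are placeholders, and a real export will contain more access entries than shown here):

```json
{
  "access": [
    {
      "role": "OWNER",
      "userByEmail": "[email protected]"
    },
    {
      "role": "OWNER",
      "specialGroup": "projectOwners"
    }
  ],
  "datasetReference": {
    "datasetId": "mydataset",
    "projectId": "my-project"
  }
}
```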
and then updating it. This seemed to work, because I was able to create a table, but then I got an access denied error, "User does not have bigquery.jobs.create permission in project", when I tried loading data into the table.
When I inspected the project in the cloud console, it looked as if my service account had been added to the project rather than the dataset, which is not what I want, but that also does not explain why I don't have the correct permissions. In addition to OWNER, I tried assigning the editor and admin roles, neither of which solved the issue.
Upvotes: 0
Views: 2383
Reputation: 7287
It is not possible for a service account to have permissions only at the dataset level and then run a query. When a query is invoked, it creates a job, and to create a job the service account must have the bigquery.jobs.create permission granted at the project level. See the documentation for the permissions required to run a job.
With this in mind, bigquery.jobs.create must be granted at the project level so the service account can run queries on the shared dataset.
NOTE: You can use any of the predefined roles that include bigquery.jobs.create.
In my example I used roles/bigquery.jobUser, which includes that permission. See the steps below:

1. Export the dataset's metadata:

bq show --format=prettyjson my-project:mydataset > info.json

2. Add the service account to the access section of info.json:

{ "role": "OWNER", "userByEmail": "[email protected]" },

3. Apply the updated access list:

bq update --source info.json my-project:mydataset

4. Grant the project-level role that carries bigquery.jobs.create:

gcloud projects add-iam-policy-binding myproject --member=serviceAccount:[email protected] --role=roles/bigquery.jobUser
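Putting it all together, the full flow might look like this (a sketch only; these commands require an authenticated gcloud session, and the service-account, project, and dataset names are placeholders):

```shell
# 1. Create the service account (placeholder name)
gcloud iam service-accounts create my-sa

# 2. Export the dataset's current access list
bq show --format=prettyjson my-project:mydataset > info.json

# 3. Edit info.json to add the service account to the "access"
#    array, then apply the updated access list
bq update --source info.json my-project:mydataset

# 4. Grant a project-level role that includes bigquery.jobs.create,
#    so the service account can create query and load jobs
gcloud projects add-iam-policy-binding my-project \
  --member=serviceAccount:[email protected] \
  --role=roles/bigquery.jobUser
```

With the dataset-level OWNER entry plus the project-level jobUser binding, the service account should be able to both create tables and load data into them.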
Upvotes: 1