jpennell

Reputation: 651

How do I configure Google BigQuery command line tool to use a Service Account?

I've created a service account using the Google API Console and wish to use this service account with the Google BigQuery CLI (bq) tool.

I've been using the command line tool to successfully access the BigQuery service using my valid OAuth2 credentials in ~/.bigquery.v2.token; however, I can't seem to find any documentation on how to modify this file (or otherwise configure the tool) to use a service account instead.

Here is my current .bigquery.v2.token file:

{
    "_module": "oauth2client.client",
    "_class": "OAuth2Credentials",
    "access_token": "--my-access-token--",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "invalid": false,
    "client_id": "--my-client-id--.apps.googleusercontent.com",
    "id_token": null,
    "client_secret": "--my-client-secret--",
    "token_expiry": "2012-11-06T15:57:12Z",
    "refresh_token": "--my-refresh-token--",
    "user_agent": "bq/2.0"
}

My other file, ~/.bigqueryrc, generally looks like this:

project_id = --my-project-id--
credential_file = ~/.bigquery.v2.token

I've tried setting the credential_file parameter to the .p12 private key file for my service account, but with no luck; it gives me back the following error:

******************************************************************
** No OAuth2 credentials found, beginning authorization process **
******************************************************************

And asks me to go to a link in my browser to set up my OAuth2 credentials again.

The command line tool's initial configuration command, "init":

bq help init

displays no helpful information about how to set up this tool to use a service account.

Upvotes: 16

Views: 23182

Answers (5)

Michael Delgado

Reputation: 15452

For anyone else who comes along struggling to use bq with a service account... I had a seriously hard time getting this to work inside a CI/CD pipeline using the Google Cloud SDK Docker images on gitlab-ci. It turns out the missing bit for me was making sure to set the default project. On my laptop, gcloud was happy to infer the default project from the service account, but for some reason the version within the Docker image was defaulting to a public free project.

- gcloud auth activate-service-account --key-file=${PATH_TO_SVC_ACCT_JSON};
- gcloud config set project ${GOOGLE_BIGQUERY_PROJECT}

After this I was able to use the bq utility as the service account. I imagine setting the default project in the .bigqueryrc file does the trick too, which is why the OP didn't run into this issue.
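To put the two commands in context, here is a minimal .gitlab-ci.yml sketch; the job name, image tag, and CI variable names are illustrative assumptions, not a confirmed setup:

```yaml
# Hypothetical job definition; adjust the image tag and variables
# (PATH_TO_SVC_ACCT_JSON, GOOGLE_BIGQUERY_PROJECT) to your project.
run-bq-query:
  image: google/cloud-sdk:slim
  script:
    - gcloud auth activate-service-account --key-file=${PATH_TO_SVC_ACCT_JSON}
    - gcloud config set project ${GOOGLE_BIGQUERY_PROJECT}
    - bq query --use_legacy_sql=false 'SELECT CURRENT_DATE()'
```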

Upvotes: 0

Daniel
Daniel

Reputation: 664

1.) Tell gcloud to authenticate as your service account

gcloud auth activate-service-account \
[email protected] \
--key-file=/path/key.json \
--project=testproject

2.) Run a bq command as you would with your user account

# ex: bq query
bq query --use_legacy_sql=false 'SELECT CURRENT_DATE()'

3.) Optional: revert gcloud authentication to your user account

gcloud config set account [email protected]

3a.) Optional: see which account gcloud is using for authentication

gcloud auth list

Upvotes: 9

scott

Reputation: 235

The bq authorization flags are now deprecated.

See the bq documentation.

Upvotes: 5

jpennell

Reputation: 651

I ended up finding some documentation on how to set this up:

$ bq --help

....

--service_account: Use this service account email address for authorization. For example, [email protected].
(default: '')

--service_account_credential_file: File to be used as a credential store for service accounts. Must be set if using a service account.

--service_account_private_key_file: Filename that contains the service account private key. Required if --service_account is specified.
(default: '')

--service_account_private_key_password: Password for private key. This password must match the password you set on the key when you created it in the Google APIs Console. Defaults to the default Google APIs Console private key password.
(default: 'notasecret')

....

You can either set these flags on each bq (BigQuery command line client) request, e.g.:

$ bq --service_account --my-client-id--.apps.googleusercontent.com --service_account_private_key_file ~/.bigquery.v2.p12 ... [command]

Or you can set up defaults in your ~/.bigqueryrc file like so:

project_id = --my-project-id--
service_account = [email protected]
service_account_credential_file = /home/james/.bigquery.v2.cred
service_account_private_key_file = /home/james/.bigquery.v2.p12

The service account email can be found in the Google API Console, and you set service_account_private_key_password when you created your service account (it defaults to "notasecret").

Note: file paths in .bigqueryrc had to be full paths; I was unable to use ~/.bigquery...
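One way to avoid the tilde problem is to let the shell expand $HOME when writing the file; this is a sketch using the placeholder values from the examples above:

```shell
# Write ~/.bigqueryrc with absolute paths: the shell expands $HOME
# in the heredoc, since bq does not expand "~" inside the config
# file itself. The project id, account, and file names below are
# the placeholders used elsewhere in this answer.
cat > "$HOME/.bigqueryrc" <<EOF
project_id = --my-project-id--
service_account = [email protected]
service_account_credential_file = $HOME/.bigquery.v2.cred
service_account_private_key_file = $HOME/.bigquery.v2.p12
EOF
```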

Some additional dependencies were required; you will need to install the OpenSSL development headers via yum/apt-get:

--yum--
$ yum install openssl-devel

--or apt-get--
$ apt-get install libssl-dev

and pyopenssl via easy_install/pip:

--easy install--
$ easy_install pyopenssl

--or pip--
$ pip install pyopenssl

Upvotes: 11

Michael Sheldon

Reputation: 2057

The bq tool requires two configuration files, controlled by the --bigqueryrc and --credential_file flags. If either one is missing, bq will attempt to initialize itself automatically during startup.

To avoid this for the --bigqueryrc file, you can place a ".bigqueryrc" file in the default location, or override it with --bigqueryrc pointing to some writable file path.
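A sketch of pre-creating a minimal rc file so the interactive flow is skipped; the path and contents here are illustrative, not a confirmed requirement:

```shell
# Create a minimal rc file at a scratch path (illustrative);
# pointing bq at it with --bigqueryrc means the file already
# exists and bq will not try to initialize interactively.
cat > /tmp/bigqueryrc.example <<'EOF'
project_id = --my-project-id--
EOF
# Then invoke bq with both flags, e.g.:
#   bq --bigqueryrc=/tmp/bigqueryrc.example --credential_file=/path/to/cred ls
```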

Upvotes: 1
