MattiaPance

Reputation: 31

How to use pandas-gbq with BigQuery Storage API within AI platform training?

I'm submitting a training job to the GCP AI Platform training service. My training dataset (around 40M rows in a BigQuery table in the same GCP project) needs to be preprocessed at the beginning of the training job as a pandas DataFrame, so I tried both solutions proposed by the GCP documentation:

  • the pandas-gbq API
  • the google-cloud-bigquery API

Both methods work on an AI Platform notebook VM, downloading the whole 40M-row dataset as a pandas DataFrame in a few minutes. I'm struggling to replicate the same procedure on the AI Platform training server (which runs on an n1-highmem-16 machine). With the pandas-gbq API I get a permission-denied error:

google.api_core.exceptions.PermissionDenied: 403 request failed: the user does not have 'bigquery.readsessions.create' permission for 'projects/acn-c4-crmdataplatform-dev'

With the google-cloud-bigquery API there are no errors.
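For reference, a minimal sketch of the two download paths being compared (the project ID and query are placeholders, and the exact keyword arguments may vary with your pandas-gbq / google-cloud-bigquery versions):

```python
# Sketch of the two documented download paths; pandas-gbq and
# google-cloud-bigquery must be installed for the imports to succeed.

def load_with_pandas_gbq(query, project_id):
    """pandas-gbq path: use_bqstorage_api=True routes the download through
    the BigQuery Storage API, which requires the caller to hold the
    bigquery.readsessions.create permission on the project."""
    import pandas_gbq
    return pandas_gbq.read_gbq(
        query, project_id=project_id, use_bqstorage_api=True
    )

def load_with_bigquery_client(query, project_id):
    """google-cloud-bigquery path: without a Storage API client attached,
    to_dataframe() falls back to the slower REST-based download, which is
    why this variant raises no readsessions error."""
    from google.cloud import bigquery
    client = bigquery.Client(project=project_id)
    return client.query(query).to_dataframe()
```

Usage would be e.g. `load_with_pandas_gbq("SELECT * FROM \`my_dataset.my_table\`", "my-project-id")` (hypothetical table and project names).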

Here is the list of required packages that, as suggested by the GCP documentation, I pass to the AI Platform training job with a setup.py file in the trainer package:
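(The asker's actual package list isn't shown. Assuming the two client libraries discussed above plus their Storage API dependencies, such a setup.py might look roughly like this; the package set and the `trainer` name are assumptions, not the asker's real file:)

```python
from setuptools import find_packages, setup

setup(
    name="trainer",  # assumed package name
    version="0.1",
    packages=find_packages(),
    install_requires=[
        # assumed dependency set, matching the two APIs discussed above
        "pandas-gbq",
        "google-cloud-bigquery",
        "google-cloud-bigquery-storage",
        "pyarrow",  # Arrow backs the fast Storage API downloads
    ],
)
```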

Upvotes: 3

Views: 1291

Answers (1)

guillaume blaquiere

Reputation: 75940

You have to do 2 things:

  • First, check that the service account service-<PROJECT_NUMBER>@cloud-ml.google.com.iam.gserviceaccount.com exists and has the Cloud ML Service Agent role. If not, add the role manually (you don't have to create the service account!).
  • Then, grant this service account the permission to query your BigQuery dataset.
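Assuming the gcloud CLI is available, the two steps above might look like this (PROJECT_ID and PROJECT_NUMBER are placeholders; the role choices are assumptions — roles/bigquery.readSessionUser carries the bigquery.readsessions.create permission named in the error):

```shell
# Step 1: ensure the AI Platform service agent has the
# Cloud ML Service Agent role (only needed if the binding is missing).
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:service-PROJECT_NUMBER@cloud-ml.google.com.iam.gserviceaccount.com" \
  --role="roles/ml.serviceAgent"

# Step 2: let it create BigQuery Storage API read sessions
# and read the training dataset.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:service-PROJECT_NUMBER@cloud-ml.google.com.iam.gserviceaccount.com" \
  --role="roles/bigquery.readSessionUser"
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:service-PROJECT_NUMBER@cloud-ml.google.com.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
```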

Upvotes: 2
