abhi

Reputation: 21

Google Cloud Dataflow Error | apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing

I have developed a wordcount pipeline using Apache Beam, and I am able to run the Python code locally on my machine. But when I try to run it on Dataflow, I get this error.

apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing https://dataflow.googleapis.com/v1b3/projects/mw-da-training/locations/%3Dus-central/jobs?alt=json: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 27 May 2022 11:56:56 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'alt-svc': 'h3=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '158', '-content-encoding': 'gzip'}>, content <{
"error": { "code": 403, "message": "Permission denied on 'locations/=us-central' (or it may not exist).", "status": "PERMISSION_DENIED" } }

Upvotes: 1

Views: 1114

Answers (1)

Pablo

Reputation: 11041

You likely need to authenticate your pipeline with Google Cloud. There are a few ways of doing this:

Using GOOGLE_APPLICATION_CREDENTIALS

This is an environment variable that Google applications use to authenticate with Google Cloud. You would:

  1. Download a service account key and store it as a file (e.g. my_credentials.json).
  2. Point the GOOGLE_APPLICATION_CREDENTIALS variable to this file (i.e. export GOOGLE_APPLICATION_CREDENTIALS=/path/to/my_credentials.json).
  3. You're good to go. Run your pipeline!
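As a sketch, the steps above might look like this in a shell (the project ID is taken from the error message; the bucket, script, and path names are illustrative). Note also that the error URL contains locations/%3Dus-central, i.e. =us-central, which suggests a stray = may have crept into the region flag: pass a plain region name such as us-central1.

```shell
# 1. Point GOOGLE_APPLICATION_CREDENTIALS at the downloaded key file
#    (path is illustrative).
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/my_credentials.json

# 2. Launch the pipeline on Dataflow. Flag values below are
#    illustrative; note the region is a bare name, no "=" inside it.
python wordcount.py \
  --runner DataflowRunner \
  --project mw-da-training \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp/ \
  --input gs://dataflow-samples/shakespeare/kinglear.txt \
  --output gs://my-bucket/output/counts
```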

Using gcloud to log in

  1. Log in to your own user account using gcloud auth application-default login. This will set up the application default login for your session.
  2. You're good to go. Run your pipeline!
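A sketch of this second route (flag values are again illustrative): gcloud auth application-default login stores application-default credentials that Google client libraries, including the Dataflow runner, pick up automatically, so no GOOGLE_APPLICATION_CREDENTIALS variable is needed.

```shell
# 1. Authenticate once with your own user account; this opens a
#    browser and caches application-default credentials locally.
gcloud auth application-default login

# 2. Run the pipeline as before (illustrative flag values).
python wordcount.py \
  --runner DataflowRunner \
  --project mw-da-training \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp/
```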

Upvotes: 1
