Reputation: 2331
I am trying to use Google Cloud PubSub with my Google Cloud Dataproc cluster and I am getting authentication scope errors like the following:
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Request had insufficient authentication scopes.",
    "reason" : "forbidden"
  } ],
  "message" : "Request had insufficient authentication scopes.",
  "status" : "PERMISSION_DENIED"
}
How can I resolve this issue so I can use PubSub (and other Google Cloud) products in my Spark/Hadoop projects running on Cloud Dataproc?
Upvotes: 3
Views: 1873
Reputation: 2331
Google Cloud Dataproc includes some authentication scopes by default, but it does not presently include scopes for every Google Cloud Platform product. You can add scopes to a cluster when creating it with the Google Cloud SDK by passing the --scopes flag to the gcloud beta dataproc clusters create command. For example, adding --scopes https://www.googleapis.com/auth/pubsub grants the cluster the PubSub scope.
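A full cluster create command might look like the following sketch, where the cluster name example-cluster is just a placeholder:

gcloud beta dataproc clusters create example-cluster \
    --scopes https://www.googleapis.com/auth/pubsub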
As long as the service honors the "catch all" scope, you can instead use --scopes https://www.googleapis.com/auth/cloud-platform to add scopes for many services at once.
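For instance, to create a cluster with the broad cloud-platform scope (again with a placeholder cluster name):

gcloud beta dataproc clusters create example-cluster \
    --scopes https://www.googleapis.com/auth/cloud-platform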
You can find more information about authentication and authorization in the Google Cloud Platform documentation.
Upvotes: 4