Reputation: 1909
I retrieve data from Google Workspace APIs. I authenticate to those APIs with a Service Account from a Dataproc cluster.
I have two ways to authenticate with my Service Account: either I use a JSON key file to authenticate as SA-with-keyfile, or I use the default SA of my Dataproc cluster, SA-default.
Both SAs are authorized to access the data and I provided them with the same scopes. Here is a sample of the code generating the Google Credentials:
import com.google.auth.oauth2.GoogleCredentials;
import com.google.auth.oauth2.ServiceAccountCredentials;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

/**
 * Generate GoogleCredentials from config (minimal code example)
 */
public class ServiceAccountUtil {

    // Generate GoogleCredentials from a Service Account JSON key file
    public static GoogleCredentials getCredentialsWithKey(String scopes, String privateKeyJson) throws IOException {
        ServiceAccountCredentials serviceAccountCredentials;
        try (InputStream stream = new ByteArrayInputStream(privateKeyJson.getBytes())) {
            serviceAccountCredentials = ServiceAccountCredentials.fromStream(stream);
        }
        return serviceAccountCredentials.createScoped(scopes.split(" "));
    }

    // Generate GoogleCredentials using the default Service Account (Application Default Credentials)
    public static GoogleCredentials getCredentialsDefault(String scopes) throws IOException {
        return ServiceAccountCredentials.getApplicationDefault().createScoped(scopes.split(" "));
    }
}
When using the SA SA-with-keyfile, everything works fine and I retrieve my data.
However, when using SA-default, the API answers with:
{
  "error": {
    "code": 403,
    "message": "Request had insufficient authentication scopes.",
    "errors": [
      {
        "message": "Insufficient Permission",
        "domain": "global",
        "reason": "insufficientPermissions"
      }
    ],
    "status": "PERMISSION_DENIED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "ACCESS_TOKEN_SCOPE_INSUFFICIENT",
        "domain": "googleapis.com",
        "metadata": {
          "service": "admin.googleapis.com",
          "method": "ccc.hosted.frontend.directory.v1.DirectoryGroups.List"
        }
      }
    ]
  }
}
I don't understand why I am getting this error in one case (the SA without a JSON key file) since I am using the same scopes in both cases.
Upvotes: 0
Views: 1367
Reputation: 75745
When you create a Dataproc cluster, it runs with the Compute Engine default service account unless you specify another one. When a VM uses the Compute Engine default service account, the access tokens it gets are limited to a restricted set of access scopes by default, and the scopes you request in your code cannot go beyond them. That limitation does not apply when you use a custom service account or another service account. (That explains why it doesn't work with the default service account and works with your service account key file.)
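You can verify which access scopes the VMs actually received by querying the metadata server from a cluster node (a quick check, assuming you can SSH into one of the Dataproc VMs):
# List the access scopes attached to the VM's default service account
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"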
On Dataproc, you can grant the cluster access to all Google Cloud scopes in the Security section when you create the cluster:
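For example, with the gcloud CLI at cluster creation time (the cluster name and region below are placeholders to adapt):
# Create the cluster with the cloud-platform scope so that the tokens issued
# to the VMs are not restricted to the default limited scopes
gcloud dataproc clusters create my-cluster \
    --region=europe-west1 \
    --scopes=https://www.googleapis.com/auth/cloud-platform
The --scopes flag accepts a comma-separated list of scope URIs, so you can also add scopes specific to the API you call if they are not covered by cloud-platform.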
Upvotes: 1