Reputation: 1
I want to run a Dataflow job to migrate data from google-project-1-table to google-project-2-table (read from one and write to the other). I am getting a permission error while doing that. I have set "GOOGLE_APPLICATION_CREDENTIALS" to point to my credential file for project-1. In project-2, the following project-1 accounts have these roles: 1) service-account (role: Editor) 2) [email protected] (role: Editor) 3) @cloudservices.gserviceaccount.com (role: Editor).
Is there anything else I need to do to run the job?
Caused by: com.google.bigtable.repackaged.com.google.cloud.grpc.io.IOExceptionWithStatus: Error in response stream
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResultQueueEntry$ExceptionResultQueueEntry.getResponseOrThrow(ResultQueueEntry.java:66)
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResponseQueueReader.getNextMergedRow(ResponseQueueReader.java:55)
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.StreamingBigtableResultScanner.next(StreamingBigtableResultScanner.java:42)
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.StreamingBigtableResultScanner.next(StreamingBigtableResultScanner.java:27)
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResumingStreamingResultScanner.next(ResumingStreamingResultScanner.java:89)
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResumingStreamingResultScanner.next(ResumingStreamingResultScanner.java:45)
    at com.google.cloud.bigtable.dataflow.CloudBigtableIO$1.next(CloudBigtableIO.java:221)
    at com.google.cloud.bigtable.dataflow.CloudBigtableIO$1.next(CloudBigtableIO.java:216)
    at com.google.cloud.bigtable.dataflow.CloudBigtableIO$Reader.advance(CloudBigtableIO.java:775)
    at com.google.cloud.bigtable.dataflow.CloudBigtableIO$Reader.start(CloudBigtableIO.java:799)
    at com.google.cloud.dataflow.sdk.io.Read$Bounded$1.evaluateReadHelper(Read.java:178)
    ... 18 more
Caused by: com.google.bigtable.repackaged.io.grpc.StatusRuntimeException: PERMISSION_DENIED: User can't access project: project-2
    at com.google.bigtable.repackaged.io.grpc.Status.asRuntimeException(Status.java:431)
    at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.StreamObserverAdapter.onClose(StreamObserverAdapter.java:48)
    at com.google.bigtable.repackaged.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$3.runInContext(ClientCallImpl.java:462)
    at com.google.bigtable.repackaged.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:54)
    at com.google.bigtable.repackaged.io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:154)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    ... 1 more
Upvotes: 0
Views: 2181
Reputation: 6023
There are some instructions for this in the section "Accessing Cloud Platform Resources Across Multiple Cloud Platform Projects" of the Dataflow Security and Permissions guide.
Since that guide does not explicitly address Cloud Bigtable, I will try to write up the requirements clearly here in terms of your question.
Using fake project id numbers, it seems you have:

- project-1, with id 12345
- project-2, with id 9876
- google-project-1-table in project-1
- google-project-2-table in project-2

and a Dataflow job running in project-1, which you want to:

- read from google-project-1-table
- write to google-project-2-table

Is that accurate?
Your Dataflow workers that write to Bigtable run as the Compute Engine service account, which is [email protected]. This account will need to be able to access project-2 and write to google-project-2-table.
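One way to grant that access is an IAM binding on project-2 for that service account. A sketch with gcloud, using the fake project ids above; roles/bigtable.user is my assumption here — any role on project-2 that permits Bigtable reads and writes (including the Editor role you are already using) would also satisfy it:

```
gcloud projects add-iam-policy-binding project-2 \
    --member="serviceAccount:[email protected]" \
    --role="roles/bigtable.user"
```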
Your error message implies that the permissions failure occurs at the coarsest granularity: the account cannot access project-2 at all.
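To check whether the worker service account has any role on project-2, you can inspect the project's IAM policy (again with the fake project number standing in for yours):

```
gcloud projects get-iam-policy project-2 \
    --flatten="bindings[].members" \
    --format="table(bindings.role)" \
    --filter="bindings.members:[email protected]"
```

If this prints no rows, the account has no role on project-2, which matches the PERMISSION_DENIED error above.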
Upvotes: 2