Reputation: 440
I'm implementing a Cloud Dataflow job on GCP that needs to deal with two GCP projects. Both the input and the output are BigQuery partitioned tables. The issue I'm facing is that I must read data from project A and write it into project B.
I haven't seen anything related to cross-project service accounts, and I can't give Dataflow two different credential keys either, which is a bit annoying. Has anyone else dealt with this kind of architecture, and if so, how?
Upvotes: 0
Views: 2344
Reputation: 10974
I think you can accomplish this with the following steps:
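The steps themselves did not survive in this answer, so here is a hedged sketch of the usual approach (project IDs and the service-account address are placeholders, not from the original): grant the Dataflow worker service account of project B the BigQuery roles it needs in each project.

```shell
# Placeholder values: project-a is the source project, project-b runs the
# Dataflow job. The worker service account shown is hypothetical.
WORKER_SA="123456789-compute@developer.gserviceaccount.com"

# Allow the worker to read the source tables in project A.
gcloud projects add-iam-policy-binding project-a \
  --member="serviceAccount:${WORKER_SA}" \
  --role="roles/bigquery.dataViewer"

# Allow the worker to run BigQuery jobs and write output in project B.
gcloud projects add-iam-policy-binding project-b \
  --member="serviceAccount:${WORKER_SA}" \
  --role="roles/bigquery.jobUser"
gcloud projects add-iam-policy-binding project-b \
  --member="serviceAccount:${WORKER_SA}" \
  --role="roles/bigquery.dataEditor"
```

Because both grants target the same service account, the job carries one identity and no second credential key is involved.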
Upvotes: 4
Reputation: 237
It is very simple: you need to grant the required permissions/access to your service account in both projects.
So you only need one service account that has the required access/permissions in both projects.
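Once that single service account has access in both projects, it can be attached to the Dataflow job and the tables referenced in the `project:dataset.table` form. A hedged launch sketch, assuming an Apache Beam Python pipeline (file name, bucket, table names, and the `--input_table`/`--output_table` options are hypothetical pipeline parameters, not from the original):

```shell
# --service_account_email attaches the one service account that was granted
# BigQuery access in both projects; table specs name each project explicitly.
python my_pipeline.py \
  --runner DataflowRunner \
  --project project-b \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp \
  --service_account_email dataflow-sa@project-b.iam.gserviceaccount.com \
  --input_table project-a:dataset.source_table \
  --output_table project-b:dataset.dest_table
```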
Hope it helps.
Upvotes: 1