Reputation: 667
I have two Google Cloud projects; the first produces data through Hadoop and writes it to a Cloud Storage bucket (gs://bucket). The second project will be used by the marketing team to query that data via BigQuery.
Creating the table in the second project and submitting the load job both seem to work, but even though there is a large volume of data in the bucket, the load job reports that there is no data in the bucket.
Is what we are trying to do possible?
I can give Compute Engine instances access to buckets in a different project simply by granting access to their service accounts, but in this case I'm not sure whether it is just a permissions issue or not.
Thanks in advance.
Upvotes: 2
Views: 495
Reputation: 26617
This sounds like a permissions issue. The account that is running the load job must have read access to the bucket and to all of the files in it.
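A minimal sketch of the cross-project grant: give the service account that runs the load job read access to the bucket. The service account email and bucket name below are hypothetical placeholders, not values from the question.

```shell
# Hypothetical names: replace with your own service account and bucket.
# objectViewer grants read access to all objects in the bucket.
gsutil iam ch \
  serviceAccount:bq-loader@marketing-project.iam.gserviceaccount.com:objectViewer \
  gs://my-hadoop-bucket
```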
Looking at your job, however, it looks like you also need to add a `*` to the end of your source URI so that BigQuery knows to import all of those files.
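To see why the trailing `*` matters: Hadoop writes its output as many `part-NNNNN` files, and BigQuery's source URI wildcard matches object names the way a glob does. A small local illustration of that matching behavior (the object names are hypothetical, and Python's `fnmatch` is used here only as a stand-in for BigQuery's wildcard matching):

```python
from fnmatch import fnmatch

# Hypothetical object names as Hadoop typically writes them.
objects = [
    "output/part-00000",
    "output/part-00001",
    "output/_SUCCESS",
]

def matched(pattern):
    """Return the object names that the given URI suffix would select."""
    return [name for name in objects if fnmatch(name, pattern)]

# Without the wildcard, nothing is literally named "output/part-",
# so the load job finds no data.
print(matched("output/part-"))    # []

# With the trailing *, all the part files are selected.
print(matched("output/part-*"))   # ['output/part-00000', 'output/part-00001']
```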
Upvotes: 2