dex

Reputation: 63

Google BigQuery: export table to own bucket results in unexpected error

I'm stuck trying to export a table to my Google Cloud Storage bucket.

Example job id: job_0463426872a645bea8157604780d060d

I tried the Cloud Storage target with a lot of different variations; all produce the same error. If I try to copy the natality sample table, it works.

What am I doing wrong?

Thanks!

Daniel

Upvotes: 3

Views: 4035

Answers (3)

pedro

Reputation: 41

I was able to do it with:

bq extract --destination_format=NEWLINE_DELIMITED_JSON myproject:mydataset.mypartition gs://mybucket/mydataset/mypartition/*.json
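
To verify the result, a quick listing with gsutil (assuming the bucket and path above) should show one zero-padded shard per output file, along the lines of:

# list the exported shards; BigQuery replaces the * in the URI
# with a zero-padded file number
gsutil ls gs://mybucket/mydataset/mypartition/
#   gs://mybucket/mydataset/mypartition/000000000000.json
#   gs://mybucket/mydataset/mypartition/000000000001.json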

Upvotes: 0

user1302884

Reputation: 843

Specify the file extension along with the pattern. For example:

gs://foo/bar/baz*.gz for GZIP (compressed) output

gs://foo/bar/baz*.csv for CSV (uncompressed) output

Here foo is the bucket name, and bar can be a directory named after the current date as a string, generated on the fly (see the sketch below).
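
A minimal sketch of both variants, assuming a hypothetical table myproject:mydataset.mytable. Note that the --compression=GZIP flag is what actually compresses the output; the .gz extension is just part of the URI pattern:

# compressed CSV export: --compression=GZIP pairs with the *.gz pattern;
# the date-named bar segment is generated on the fly
bq extract --compression=GZIP myproject:mydataset.mytable "gs://foo/$(date +%Y-%m-%d)/baz*.gz"

# uncompressed CSV export
bq extract myproject:mydataset.mytable "gs://foo/$(date +%Y-%m-%d)/baz*.csv"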

Upvotes: 2

Jordan Tigani

Reputation: 26617

It looks like the error says: "Table too large to be exported to a single file. Specify a uri including a * to shard export." Try switching the destination URI to something like gs://foo/bar/baz*
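
A minimal sketch of that fix, assuming a hypothetical table myproject:mydataset.mytable; BigQuery replaces the * with a zero-padded shard number:

# sharded export; output objects get names like baz000000000000, baz000000000001, ...
bq extract myproject:mydataset.mytable gs://foo/bar/baz*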

Upvotes: 4
