Stan

Reputation: 1

Load multiple csv from Google Cloud Storage to multiple table in BigQuery

I have 20 CSV files stored in one folder in Google Cloud Storage, and I would like to create a table in BigQuery for each of them. The 20 CSV files are all named differently, and I'm trying to automate the load process so I can avoid using the BigQuery UI to load them one by one.

Is there any way I can load and create all 20 tables at the same time?

I tried the following in Cloud Shell, but obviously it only loads one table at a time. Any help much appreciated!

 bq load \
    --autodetect \
    --source_format=CSV \
    --replace \
    sandbox.Bq_load_Test \
    gs://bucket/folder/xxx.csv

Upvotes: 0

Views: 1050

Answers (1)

Cylldby

Reputation: 1978

Inspired by this answer

gsutil ls gs://bucket/folder/*.csv | \
awk '{n=split($1,A,"/"); q=split(A[n],B,"."); print "YOURPROJECT:YOURDATASET."B[1]" "$0}' | \
xargs -I{} sh -c 'bq --location=YOURLOCATION load --replace=true --autodetect --source_format=CSV {}'

Basically this lists all the CSV files in your bucket folder, and for each one extracts the base name and loads the file into a table of the same name in the dataset of your choosing.
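If the awk/xargs one-liner is hard to follow, here is a sketch of the same logic as a plain bash loop (assuming `gsutil` and `bq` are installed and you are authenticated; `YOURPROJECT`, `YOURDATASET`, `YOURLOCATION`, and the bucket path are placeholders to replace):

```shell
#!/usr/bin/env bash
set -euo pipefail

# List every CSV in the folder, then load each one into a table
# named after the file (minus the .csv extension).
for uri in $(gsutil ls 'gs://bucket/folder/*.csv'); do
  table=$(basename "$uri" .csv)   # gs://bucket/folder/sales.csv -> sales
  bq --location=YOURLOCATION load \
     --replace=true \
     --autodetect \
     --source_format=CSV \
     "YOURPROJECT:YOURDATASET.${table}" \
     "$uri"
done
```

The loads run sequentially here; the xargs version can be parallelized with `-P` if the 20 files are large.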

Upvotes: 0
