zyxue

Reputation: 8820

Efficient way to upload multiple files to separate locations on Google Cloud Storage

Here is my case: I'd like to copy multiple files to separate locations on Google Cloud Storage, e.g.:

gsutil -m cp /local/path/to/d1/x.csv.gz gs://bucketx/d1/x.csv.gz
gsutil -m cp /local/path/to/d2/x.csv.gz gs://bucketx/d2/x.csv.gz
gsutil -m cp /local/path/to/d3/x.csv.gz gs://bucketx/d3/x.csv.gz
...

I have over 10k such files, and copying them with separate gsutil calls is really slow; a lot of time is wasted on setting up network connections. What's the most efficient way to do this?

Upvotes: 0

Views: 4132

Answers (1)

Travis Hobrla

Reputation: 5511

If your paths consistently follow the pattern in your example, you can do this with a single gsutil command:

gsutil -m cp -r /local/path/to/* gs://bucketx

However, this only works if you want the destination naming to mirror the source. If your paths are arbitrary mappings of source name to destination name, you'll need to run individual commands (which, as you note, can be sped up by running them in parallel).
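For example, one way to drive those individual commands is with GNU parallel; this is just a sketch, assuming a hypothetical mappings.txt file that lists one "source destination" pair per line:

# mappings.txt lines look like: /local/path/to/d1/x.csv.gz gs://bucketx/d1/x.csv.gz
# -j 8 runs 8 uploads concurrently; {1} and {2} are the two columns of each line
parallel --colsep ' ' -j 8 gsutil cp {1} {2} :::: mappings.txt

Reusing a pool of concurrent workers this way avoids paying the per-invocation startup cost serially, which is where most of the time goes with 10k separate sequential calls.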

Upvotes: 2
