Reputation: 155
I have a VM with a 60 GB disk, of which 45 GB was used. It has multiple users.
I have deleted old users, and now about 40 GB is used. I want to download all the data to my local machine. How do I do that? The only way I know is scp.
The command is something like this:
gcloud compute scp --recurse $USER@$INSTANCE:/home/$USER ~/$LOCAL_DIR
But it's very slow and might take a couple of days. Is there a better, easier way to download the data?
Upvotes: 0
Views: 1164
Reputation: 2673
The bottleneck here seems to be that you're downloading a considerable amount of data over SSH, which is quite slow.
One thing you can try to speed up the process is to break the transfer into two parts: first copy the data from the VM to a Cloud Storage bucket, then download it from the bucket to your local machine.
So from the VM you'll execute:
gsutil cp -R /home/$USER gs://BUCKET_NAME
And then from your local machine:
gsutil cp -R gs://BUCKET_NAME ~/$LOCAL_DIR
gsutil uses parallel composite uploads to speed up uploading large files, in case you have any. And in the first step you'll be doing GCP-to-GCP communication, which will be much faster than downloading over SSH directly.
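Putting the two steps together, a minimal sketch could look like the following. The bucket name and local directory are placeholders you'd replace with your own; the `-m` flag (which does exist in gsutil) runs copies in parallel across multiple files, which can help a lot when a home directory contains many small files. This assumes you've already authenticated with `gcloud auth login` and set a project.

```shell
# On the VM: create a bucket (name must be globally unique) and upload.
# "my-transfer-bucket" is a placeholder bucket name.
gsutil mb gs://my-transfer-bucket
gsutil -m cp -r /home/$USER gs://my-transfer-bucket

# On your local machine: download everything, then clean up the bucket
# so you don't keep paying for storage.
gsutil -m cp -r gs://my-transfer-bucket/$USER ~/vm-backup
gsutil -m rm -r gs://my-transfer-bucket
```

Note that downloading from the bucket to your machine still goes over your internet connection, so the main wins here are the faster VM-to-bucket hop and gsutil's parallelism, not a change in your local bandwidth.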
Upvotes: 3