Reputation: 415
I just got a Google Cloud Platform Deep Learning VM running and opened JupyterLab on my localhost. Now I can use the VM, and my working directory is /home/jupyter. I am running a program that exports results to CSVs. I can save them on the VM in a separate folder and then right-click and download them (on the individual files, not on the folder). However, I'll have 10,000 files (at 2 KB each). How can I export those files either directly to a local directory (preferred, I use a Mac), to a GCP Storage Bucket, or to my Google Drive?
Upvotes: 0
Views: 1449
Reputation: 4961
I understand that you are using a Compute Engine instance. In that case, if you are not using a Windows-based machine, you can run the gcloud compute scp command.
In your case to copy an entire folder recursively you can use:
gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe
This copies a folder called narnia on your Compute Engine instance to a folder named wardrobe on your local machine.
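For your setup, a minimal sketch might look like the following, assuming your CSVs live in a folder named results under /home/jupyter and your instance is called my-dl-vm in zone us-central1-a (the folder, instance name, and zone are placeholders, substitute your own):
# Run this from a terminal on your Mac with the Cloud SDK installed,
# not from inside the VM or JupyterLab.
gcloud compute scp --recurse --zone us-central1-a my-dl-vm:/home/jupyter/results ~/Downloads/results
Because scp transfers the whole folder in one session, this avoids the per-file overhead of downloading 10,000 small files one at a time through the JupyterLab UI.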
Upvotes: 2