I tried this command, but it didn't work:
gcloud compute ssh [email protected] --zone my_zone \
--command='gsutil cp path/to/my_file gs://MY_BUCKET'
I even tried copying the files from the server via an HTTP request, with no results. I made the files publicly available and tried this command:
gsutil cp http://example.com/dir1/ gs://MY_BUCKET
I get an error saying the http scheme is not recognized.
I also have SSH access to the remote server.
Upvotes: 0
Views: 7420
Reputation: 1294
Combining the other answers, to copy a file from a Compute Engine instance to Google Cloud Storage, you need the following command:
gcloud compute scp --zone [ZONE] [USER]@[INSTANCE_NAME]:/path/to/file /dev/stdout | gsutil cp - gs://[BUCKET_NAME]/[FILE_NAME]
If the file is located in a remote server other than a Compute Engine instance, you can use "scp" as Nicholas said.
Upvotes: 1
Reputation: 41
gsutil cp
allows you to stream data from stdin, per the documentation here: https://cloud.google.com/storage/docs/gsutil/commands/cp. The relevant section is pasted below:
Streaming Transfers
Use '-' in place of src_url or dst_url to perform a streaming transfer. For example:

long_running_computation | gsutil cp - gs://my-bucket/obj

Streaming uploads using the JSON API (see "gsutil help apis") are buffered in memory part-way back into the file and can thus retry in the event of network or service problems.
Using this, we can stream the scp copy from your remote server to standard out (if you're on Linux or Mac) and pipe it into gsutil cp like this:
scp <USER>@<YOUR_SERVER>:/<PATH>/<FILE_NAME> /dev/stdout | gsutil cp - gs://<BUCKET_NAME>/<FILE_NAME>
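The same streaming idea works for a whole directory if you pipe a tar archive through stdin instead of a single file. A sketch, with hypothetical host, path, and bucket names:

```shell
# Stream a gzipped tar of the remote directory straight into a bucket object;
# nothing is written to the local disk along the way.
ssh <USER>@<YOUR_SERVER> 'tar czf - -C /<PATH> <DIR_NAME>' \
  | gsutil cp - gs://<BUCKET_NAME>/<DIR_NAME>.tar.gz
```

You can later restore it with the reverse pipe: `gsutil cp gs://<BUCKET_NAME>/<DIR_NAME>.tar.gz - | tar xzf -`.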
Upvotes: 2
Reputation: 1651
What you should be doing is not a regular copy, but an scp, a secure copy.
gcloud compute scp
securely copies files between a virtual machine instance and your local machine using the scp command.
More information on all available switches and descriptions can be found in the online docs.
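For completeness, a typical two-step workflow looks like this (the instance name, zone, paths, and bucket name below are hypothetical):

```shell
# Copy the file from the VM instance to the local machine over SSH.
gcloud compute scp --zone us-central1-a my-instance:/var/log/app.log ./app.log

# Then upload the local copy to the bucket.
gsutil cp ./app.log gs://my-bucket/app.log
```

If you want to avoid the intermediate local file, the streaming approach in the other answers (piping through stdin) does both steps in one pipeline.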
Upvotes: 1