Reputation: 31
Complete newbie here with regard to Google Cloud Platform and gsutil, so sorry if this question is really simple.
I have 1000 images (all JPGs) stored in the same local folder on my Windows PC that I want to upload to my Google Bucket. I know how to upload objects one at a time, but is there an easy command to upload them all at once to the root directory of the bucket?
Ideally I need their Cache-Control set to 3600 and the objects made public. Any help with the gsutil command would be great.
Second question: these images will change in the future. Is there a command to upload the folder of images again and replace the ones in the bucket?
Thanks for any help. I have spent days searching the web for a simple answer to this.
Upvotes: 0
Views: 682
Reputation: 617
This gsutil command should work (at least it does on a Mac):
gsutil -m -h "Cache-Control:public, max-age=3600" cp '*' gs://my-bucket
I added '-m' to parallelize the upload and '-h "Cache-Control:public, max-age=3600"' to set the Cache-Control header; see https://cloud.google.com/storage/docs/gsutil/commands/cp
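Since you're on Windows, note that cmd.exe won't expand '*' the way a Mac/Linux shell does, but gsutil does its own wildcard expansion, so a plain wildcard like *.jpg works there. To also make each uploaded object public, cp accepts -a with a canned ACL. A sketch, reusing the bucket name from above:
gsutil -m -h "Cache-Control:public, max-age=3600" cp -a public-read *.jpg gs://my-bucket
If your bucket has uniform bucket-level access enabled, per-object ACLs are disabled; in that case you'd make the whole bucket publicly readable instead:
gsutil iam ch allUsers:objectViewer gs://my-bucket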
You should be able to use the rsync command to sync the folder again later; see https://cloud.google.com/storage/docs/gsutil/commands/rsync
gsutil -m rsync -r -d -n dir-to-sync gs://my-bucket
The '-d' option deletes objects from the destination bucket that have been removed from the local directory, and '-r' makes it recurse into subdirectories. Lastly, '-n' makes it a dry run: it only prints what would be changed. You'll need to remove that flag to actually apply the changes, but I strongly recommend running with it first to avoid accidentally deleting everything in the destination bucket.
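Two related notes for the re-upload case, as a sketch (same bucket name assumed): the top-level -h flag should work with rsync too, so re-uploaded files keep their Cache-Control header, and setmeta can fix the header on objects already sitting in the bucket:
gsutil -m -h "Cache-Control:public, max-age=3600" rsync -r -d dir-to-sync gs://my-bucket
gsutil -m setmeta -h "Cache-Control:public, max-age=3600" gs://my-bucket/*.jpg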
Instructions on installing gsutil can be found here: https://cloud.google.com/storage/docs/gsutil_install
Upvotes: 1