Reputation: 93
I have more than 500 files that I upload to GCP Storage every 10 minutes. I am uploading these files one by one, which is a time-consuming process.
So is it possible to upload these files to GCP in bulk, in one shot?
from google.cloud import storage

def upload_files(**kwargs):
    stock_codes = kwargs['data']
    bucket_name = 'gcp-bucket-name'
    base_path = 'base_path'
    storage_client = storage.Client.from_service_account_json("gcp-services.json")
    bucket = storage_client.bucket(bucket_name)
    for filename in stock_codes:
        svg_blob_name = f"{filename}.svg"
        svg_blob = bucket.blob(svg_blob_name)
        # 'file_path' is a placeholder for the local path of this file;
        # each file is uploaded individually, which is the slow part
        svg_blob.upload_from_filename('file_path')
        print("File {} uploaded to {}.".format(filename, svg_blob_name))
Upvotes: 0
Views: 1752
Reputation: 2552
Yes. I don't know which technology you use, but it can be achieved with gsutil, for example. Try the following command:
gsutil -m cp -r dir gs://my-bucket
The -m flag enables a parallel (multi-threaded/multi-processing) copy.
For more info, check out this link: https://cloud.google.com/storage/docs/gsutil/commands/cp
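If you want to stay with the Python client library your question already uses, the same parallel-copy idea can be sketched with a thread pool around your existing per-file uploads. This is a minimal sketch, assuming the bucket and credentials from your code; names like upload_one, upload_files_parallel and MAX_WORKERS are just illustrative.

from concurrent.futures import ThreadPoolExecutor, as_completed

from google.cloud import storage

MAX_WORKERS = 8  # assumption: tune to your bandwidth and CPU

def upload_one(bucket, filename, file_path):
    # Upload a single SVG; this runs in a worker thread.
    blob = bucket.blob(f"{filename}.svg")
    blob.upload_from_filename(file_path)
    return blob.name

def upload_files_parallel(stock_codes, file_paths):
    storage_client = storage.Client.from_service_account_json("gcp-services.json")
    bucket = storage_client.bucket("gcp-bucket-name")
    # Submit all uploads at once and let the pool run them concurrently.
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        futures = [
            pool.submit(upload_one, bucket, name, path)
            for name, path in zip(stock_codes, file_paths)
        ]
        for future in as_completed(futures):
            print(f"Uploaded {future.result()}")

The uploads are still issued per object (Cloud Storage has no single "upload a whole directory" API call), but running them concurrently is what makes gsutil -m fast, and the same applies here.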
Upvotes: 1