Reputation: 161
I have the Cloud Function below, which copies files from one bucket to another in the same project, and it is working fine. I need to add a timestamp to the files in the destination bucket so that they are not overwritten the next time the function runs.
from google.cloud import storage, bigquery

def archive(event, context):
    source_bucket_name = event['bucket']
    blob_name = event['name']
    # Initiate Cloud Storage client
    storage_client = storage.Client()
    bucketName = 'test_vs'
    # Define the origin bucket
    origin = storage_client.bucket(bucketName)
    # Define the destination bucket
    destination = storage_client.bucket('test_vs_archive')
    # Get the list of the blobs located inside the bucket whose files you want to copy
    blobs = storage_client.list_blobs(bucketName)
    for blob in blobs:
        origin.copy_blob(blob, destination)
    return "Done!"
Upvotes: 0
Views: 1244
Reputation: 779
You can read the API documentation for copy_blob:
copy_blob(blob, destination_bucket, new_name=None, ...)
Parameters:
- new_name (str) – (Optional) The new name for the copied file.
So you just need to pass the new_name parameter to solve this:
from google.cloud import storage, bigquery
from datetime import datetime

def archive(event, context):
    source_bucket_name = event['bucket']
    blob_name = event['name']
    # Initiate Cloud Storage client
    storage_client = storage.Client()
    bucketName = 'test_vs'
    # Define the origin bucket
    origin = storage_client.bucket(bucketName)
    # Define the destination bucket
    destination = storage_client.bucket('test_vs_archive')
    # Get the list of the blobs located inside the bucket whose files you want to copy
    blobs = storage_client.list_blobs(bucketName)
    # Get today's date as a string, e.g. '2024-01-31'
    dt = datetime.today().strftime("%Y-%m-%d")
    for blob in blobs:
        n_name = f'{blob.name}_{dt}'  # build the new name with an f-string
        origin.copy_blob(blob, destination, new_name=n_name)
    return "Done!"
Upvotes: 2