Reputation: 1312
I want to copy a whole directory, with all of its files and subdirectories, recursively from one Google Cloud Storage bucket to another Google Cloud Storage bucket.
The following code works fine from local disk to a Google Cloud Storage bucket:
import glob
import os
from google.cloud import storage

def upload_local_directory_to_gcs(local_path, bucket, gcs_path):
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            # Recurse into subdirectories.
            upload_local_directory_to_gcs(local_file, bucket, gcs_path + "/" + os.path.basename(local_file))
        else:
            # Strip the local directory prefix to build the object name.
            remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)

upload_local_directory_to_gcs(local_path, bucket, BUCKET_FOLDER_DIR)
How can I copy a directory recursively from one bucket to another in the same project?
Upvotes: 1
Views: 2510
Reputation: 812
Though rsync is already pointed out in the comments, let me add one more point. If you have a large number of files, you can increase the speed by using -m, which parallelizes the operation. -r enables recursive copying when there are folder structures that need to be copied.
So with rsync you can use it this way:
gsutil -m rsync -r gs://source_bucket gs://destination_bucket
Please refer to the public gsutil rsync documentation for reference.
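If you would rather stay in Python, as in the question's code, a server-side copy with the google-cloud-storage client is another option. Below is a minimal sketch; the bucket names and the prefix passed at the bottom are placeholders for illustration. Since GCS has no real directories, copying every blob that shares a prefix is equivalent to a recursive directory copy.

from google.cloud import storage

def copy_gcs_directory(source_bucket_name, destination_bucket_name, prefix):
    client = storage.Client()
    source_bucket = client.bucket(source_bucket_name)
    destination_bucket = client.bucket(destination_bucket_name)

    # list_blobs(prefix=...) returns every object "inside" the folder,
    # including objects in nested subfolders.
    for blob in client.list_blobs(source_bucket_name, prefix=prefix):
        # copy_blob performs a server-side copy; the object bytes are
        # never downloaded to the machine running this script.
        source_bucket.copy_blob(blob, destination_bucket, new_name=blob.name)

# Hypothetical names for illustration:
copy_gcs_directory("source_bucket", "destination_bucket", "my/folder")

Keeping new_name=blob.name preserves the same object paths in the destination bucket; note that gsutil -m rsync will still generally be faster for very large numbers of objects, since it parallelizes the copies.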
Upvotes: 4