Adriano Matos

Reputation: 387

Upload a folder to Google Cloud Storage with Python?

I found upload_from_file and upload_from_filename, but is there a function or method to upload an entire folder to Cloud Storage via Python?

Upvotes: 3

Views: 4462

Answers (4)

Jo Geo

Reputation: 45

This is an improvement over the answer provided by @Maor88

This function can be used to upload either a single file or a whole directory to GCS:

from google.cloud import storage
import os
import glob

# Create the client once at module level.
storage_client = storage.Client()


def upload_to_bucket(src_path, dest_bucket_name, dest_path):
    bucket = storage_client.get_bucket(dest_bucket_name)
    # Single file: upload it directly under dest_path and stop.
    if os.path.isfile(src_path):
        blob = bucket.blob(os.path.join(dest_path, os.path.basename(src_path)))
        blob.upload_from_filename(src_path)
        return
    # Directory: upload each file, and recurse into subdirectories.
    for item in glob.glob(src_path + '/*'):
        if os.path.isfile(item):
            blob = bucket.blob(os.path.join(dest_path, os.path.basename(item)))
            blob.upload_from_filename(item)
        else:
            upload_to_bucket(item, dest_bucket_name, os.path.join(dest_path, os.path.basename(item)))

Upvotes: 2

Maor88

Reputation: 199

This works for me. It recursively copies all content from a local directory to a specific bucket-name/full-path in Google Cloud Storage:

import glob
from google.cloud import storage
import os

def upload_local_directory_to_gcs(local_path, bucket, gcs_path):
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            # Recurse into subdirectories, extending the GCS prefix.
            upload_local_directory_to_gcs(local_file, bucket, gcs_path + "/" + os.path.basename(local_file))
        else:
            # Strip the local directory prefix to build the blob name.
            remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)


upload_local_directory_to_gcs(local_path, bucket, BUCKET_FOLDER_DIR)

Upvotes: 2

Dustin Ingram

Reputation: 21580

Google Cloud Storage doesn't really have the concept of "directories", just binary blobs (whose key names might look like directory paths if you name them that way). So your current approach in Python is appropriate.

Upvotes: 0

Gabe Weiss

Reputation: 3342

I don't think there's a way to do it directly in the Python API, no, but there is in the command-line tool gsutil. You could make a system call from the Python script out to the gsutil tool, as long as you're authenticated on the command line in the shell you're calling Python from.

The command would look something like:

gsutil -m cp -r <foldername> gs://<bucketname>
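If you go the system-call route, a minimal sketch of shelling out from Python (the folder and bucket names here are placeholders, and gsutil must be installed and authenticated):

```python
import subprocess


def gsutil_upload_command(folder, bucket):
    """Build the gsutil invocation for a recursive, parallel folder upload."""
    # -m enables parallel transfers; cp -r copies the folder recursively.
    return ["gsutil", "-m", "cp", "-r", folder, "gs://" + bucket]


# Uncomment to actually run the upload:
# subprocess.run(gsutil_upload_command("myfolder", "mybucket"), check=True)
```

Passing the command as a list (rather than a single shell string) avoids shell-quoting problems with folder names containing spaces.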

Upvotes: 0
