Jonny Davy

Reputation: 55

Uploading all files within directory to Google Cloud bucket w/ Python

I have automated a process that pulls XML files from multiple devices onto my machine, into the directory c:\scripts\googlebucket\uploads

My task is to upload all the files within this directory to a Google Cloud bucket. A JSON credentials file was also provided for access to the Google Cloud location.

I have been playing around with uploading single files with success, but I'm struggling to figure out how to upload everything within a specific directory (all files within c:\scripts\googlebucket\uploads) to the bucket.

Here is my script that I've been launching to push the singular file:

from google.cloud import storage

# Authenticate with the provided service account credentials.
client = storage.Client.from_service_account_json(json_credentials_path='Service_Account.json')
bucket = client.get_bucket('us-client-uploads')

# Upload a single file, using the file name as the blob name.
blob = bucket.blob("Closed_24.02.2021.08.24.xml")
blob.upload_from_filename("Closed_24.02.2021.08.24.xml")

Would anyone happen to have any suggestions? Much appreciated.

Upvotes: 1

Views: 1261

Answers (1)

yvesonline

Reputation: 4857

As far as I know, the Google Cloud Storage Python library doesn't provide anything to help with that, so you'll need to iterate over the files in the directory in Python and use the upload_from_filename method you're already using. Something like this:

from os import listdir
from os.path import isfile, join

folder = '<your folder>'
# Collect only regular files (skip any subdirectories).
files = [f for f in listdir(folder) if isfile(join(folder, f))]
for file in files:
    local_file = join(folder, file)  # build the full local path portably
    blob = bucket.blob(file)         # blob name = bare file name
    blob.upload_from_filename(local_file)
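
Putting your authentication code and the loop above together, a minimal end-to-end sketch could look like this (the folder path, credentials file, and bucket name are taken from your question, so adjust them to your environment):

from os import listdir
from os.path import isfile, join

from google.cloud import storage

folder = r'c:\scripts\googlebucket\uploads'

# Authenticate with the provided service account credentials.
client = storage.Client.from_service_account_json(json_credentials_path='Service_Account.json')
bucket = client.get_bucket('us-client-uploads')

# Upload every regular file in the folder, using the file name as the blob name.
for name in listdir(folder):
    local_file = join(folder, name)
    if isfile(local_file):
        blob = bucket.blob(name)
        blob.upload_from_filename(local_file)
        print(f'Uploaded {local_file} to gs://us-client-uploads/{name}')

Note that listdir doesn't recurse into subdirectories; if the uploads folder ever contains nested folders, you'd need os.walk (or pathlib) to pick those files up as well.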

Upvotes: 3
