Reputation: 45
I have a script which downloads certain images (<1 MB) from web and saves into local disk. Can I save it into a Google Cloud Storage bucket instead of my local system?
import os
import cv2
import numpy as np
from urllib.request import urlopen

def downloadImage(url):
    try:
        print("Downloading {}".format(url))
        image_name = str(url).split('/')[-1]
        resp = urlopen(url)
        # decode the downloaded bytes into an OpenCV image
        image = np.asarray(bytearray(resp.read()), dtype="uint8")
        image = cv2.imdecode(image, cv2.IMREAD_COLOR)
        # current_path is defined elsewhere in the script
        cv2.imwrite(os.path.join(current_path, "Downloaded", image_name), image)
    except Exception as error:
        print(error)
Upvotes: 0
Views: 1373
Reputation: 556
You can use the GCS Python client library to interact with GCS programmatically. The following code uploads a local file located at /PATH/TO/SOURCE_FILE to the bucket gs://BUCKET_NAME:
from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
    print(
        "File {} uploaded to {}.".format(
            source_file_name, destination_blob_name
        )
    )

BUCKET_NAME = "BUCKET_NAME"
SOURCE_FILE_NAME = "/PATH/TO/SOURCE_FILE"
DESTINATION_FILE_NAME = "DESTINATION_FILE"

upload_blob(BUCKET_NAME, SOURCE_FILE_NAME, DESTINATION_FILE_NAME)
Keep in mind that, in order to use the upload_blob method, you need to have the GCS Python client library installed and authentication credentials set up. The Cloud Storage client library documentation explains how to do both.
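Since your script already holds the image bytes in memory, you can also skip the local file entirely: the same client library provides Blob.upload_from_string, which accepts raw bytes. Below is a minimal sketch of that approach; the bucket name is still a placeholder, the "Downloaded/" object prefix is only illustrative, and the helper name download_image_to_gcs is mine, not from your script.

import mimetypes
from urllib.request import urlopen
from google.cloud import storage

def download_image_to_gcs(url, bucket_name):
    """Fetches an image from a URL and writes it straight into a GCS bucket."""
    image_name = str(url).split('/')[-1]
    data = urlopen(url).read()  # raw image bytes, never written to local disk
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    # "Downloaded/" is an illustrative prefix that mirrors the local folder layout
    blob = bucket.blob("Downloaded/" + image_name)
    content_type, _ = mimetypes.guess_type(image_name)
    blob.upload_from_string(data, content_type=content_type or "application/octet-stream")
    print("Uploaded {} to gs://{}/{}".format(image_name, bucket_name, blob.name))

Note that the cv2 decode/encode round trip in your script is only needed if you want to process the images; if you just want to store them, uploading the original bytes as above is simpler.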
Upvotes: 2