Reputation: 636
I want to upload an image on Google Cloud Storage from a python script. This is my code:
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient import discovery
import googleapiclient.http
scopes = ['https://www.googleapis.com/auth/devstorage.full_control']
credentials = ServiceAccountCredentials.from_json_keyfile_name('serviceAccount.json', scopes)
service = discovery.build('storage', 'v1', credentials=credentials)
body = {'name':'my_image.jpg'}
req = service.objects().insert(
    bucket='my_bucket', body=body,
    media_body=googleapiclient.http.MediaIoBaseUpload(
        gcs_image, 'application/octet-stream'))
resp = req.execute()
If gcs_image = open('img.jpg', 'rb') (note the binary mode), the code works and correctly saves my image to Cloud Storage. How can I upload an image directly from bytes (for example, from an OpenCV/NumPy array: gcs_image = cv2.imread('img.jpg'))?
Upvotes: 12
Views: 18174
Reputation: 962
Here is how to directly upload a PIL Image from memory:
from google.cloud import storage
import io
from PIL import Image
# Define variables
bucket_name = XXXXX
destination_blob_filename = XXXXX
# Configure bucket and blob
client = storage.Client()
bucket = client.bucket(bucket_name)
blob = bucket.blob(destination_blob_filename)
im = Image.open("test.jpg")
bs = io.BytesIO()
im.save(bs, "jpeg")
blob.upload_from_string(bs.getvalue(), content_type="image/jpeg")
In addition, here is how to download blob files directly into memory as PIL Images:
blob = bucket.blob(destination_blob_filename)
downloaded_im_data = blob.download_as_bytes()
downloaded_im = Image.open(io.BytesIO(downloaded_im_data))
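Independent of Cloud Storage, the in-memory BytesIO round trip used above can be sketched and verified locally (assuming Pillow is installed; the small red test image is my own placeholder):

```python
import io
from PIL import Image

# Build a small image in memory, save it into a BytesIO buffer as JPEG,
# then reopen it the same way a downloaded blob's bytes are reopened above.
im = Image.new("RGB", (16, 16), color=(255, 0, 0))
bs = io.BytesIO()
im.save(bs, "jpeg")
reloaded = Image.open(io.BytesIO(bs.getvalue()))
```

The buffer's contents (bs.getvalue()) are exactly the bytes that upload_from_string sends to the bucket.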
Upvotes: 1
Reputation: 1088
In my case, I wanted to upload a PDF document to Cloud Storage from bytes.
When I tried the below, it created a text file with my byte string in it.
blob.upload_from_string(bytedata)
In order to create an actual PDF file using the byte string I had to do:
blob.upload_from_string(bytedata, content_type='application/pdf')
My byte data was b64encoded, so I also had to b64decode it first.
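Combining the two steps, here is a minimal sketch (the function names are mine; the upload itself assumes a google.cloud.storage Bucket object and valid credentials):

```python
import base64

def decode_b64_payload(b64_data):
    # Decode the base64 payload back to raw PDF bytes first; uploading the
    # base64 text itself would store an unreadable "PDF".
    return base64.b64decode(b64_data)

def upload_pdf_from_b64(bucket, blob_name, b64_data):
    # bucket is a google.cloud.storage Bucket (requires credentials).
    pdf_bytes = decode_b64_payload(b64_data)
    blob = bucket.blob(blob_name)
    # content_type makes Cloud Storage serve the object as a PDF rather
    # than as a text file containing the byte string.
    blob.upload_from_string(pdf_bytes, content_type="application/pdf")
```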
Upvotes: 18
Reputation: 705
If you want to upload your image from a file:
import os
from google.cloud import storage
def upload_file_to_gcs(bucket_name, local_path, local_file_name, target_key):
    try:
        client = storage.Client()
        bucket = client.bucket(bucket_name)
        full_file_path = os.path.join(local_path, local_file_name)
        bucket.blob(target_key).upload_from_filename(full_file_path)
        return bucket.blob(target_key).public_url
    except Exception as e:
        print(e)
        return None
But if you want to upload bytes directly:
import os
from google.cloud import storage
def upload_data_to_gcs(bucket_name, data, target_key):
    try:
        client = storage.Client()
        bucket = client.bucket(bucket_name)
        bucket.blob(target_key).upload_from_string(data)
        return bucket.blob(target_key).public_url
    except Exception as e:
        print(e)
        return None
Note that target_key is the prefix plus the name of the uploaded file, i.e. the full object key.
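A tiny helper (the name build_target_key is my own) makes that prefix-plus-name convention explicit; object keys always use forward slashes, regardless of the local OS:

```python
import posixpath

def build_target_key(prefix, file_name):
    # Join with forward slashes: Cloud Storage keys are flat strings,
    # and "folders" are just slash-separated prefixes.
    return posixpath.join(prefix, file_name)
```

For example, build_target_key("images/2023", "photo.jpg") produces the object key "images/2023/photo.jpg".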
Upvotes: 9
Reputation: 1572
MediaIoBaseUpload expects an io.Base-like object and raises the following error upon receiving an ndarray object:
'numpy.ndarray' object has no attribute 'seek'
To solve it, I am using TemporaryFile and numpy.ndarray.tofile():
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient import discovery
import googleapiclient
import numpy as np
import cv2
from tempfile import TemporaryFile
scopes = ['https://www.googleapis.com/auth/devstorage.full_control']
credentials = ServiceAccountCredentials.from_json_keyfile_name('serviceAccount.json', scopes)
service = discovery.build('storage', 'v1', credentials=credentials)
body = {'name':'my_image.jpg'}
with TemporaryFile() as gcs_image:
    cv2.imread('img.jpg').tofile(gcs_image)
    req = service.objects().insert(
        bucket='my_bucket', body=body,
        media_body=googleapiclient.http.MediaIoBaseUpload(
            gcs_image, 'application/octet-stream'))
    resp = req.execute()
Be aware that googleapiclient is non-idiomatic and in maintenance mode (it is no longer actively developed). I would recommend using the idiomatic google-cloud-storage client instead.
Upvotes: 1