Reputation: 199
My use case is very simple: I'm fetching a raw JSON response from my REST API and keeping it as a dictionary in Python. I have to write this data into Google Cloud Storage. Is there any approach other than the "upload_from_string" option?
Upvotes: 5
Views: 3803
Reputation: 639
I have a similar use case: I want to throw some data, which is in dictionary format, into a Google Cloud Storage bucket.
I'm assuming you have already created a bucket (it is a simple task if you are trying to do it programmatically).
from google.cloud import storage
import json
import os

def upload_to_gcloud(data: dict):
    """
    This function takes a dictionary as input and uploads
    it to a Google Cloud Storage bucket.
    """
    # path to your service-account credentials JSON file
    os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "path/to/your-service-account-key.json"
    # instance of the storage client
    storage_client = storage.Client()
    # instance of a bucket in your Google Cloud Storage
    bucket = storage_client.get_bucket("your-bucket-name")
    # if you want to create a new file
    blob = bucket.blob("filename-you-want-here")
    # if a file already exists, fetch it instead:
    # blob = bucket.get_blob("filename-of-that-file")
    # upload the data; json.dumps() serializes the dictionary as a string
    blob.upload_from_string(json.dumps(data))
This approach will work with any data you can represent as a string. If you want to upload a file directly from your local filesystem, use upload_from_filename() instead, as in the sketch below.
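A minimal sketch of that variant (the bucket and object names are placeholders, and upload_via_tempfile is a hypothetical helper): serialize the dict to a temporary file, then hand the path to upload_from_filename().

import json
import tempfile
from google.cloud import storage

def upload_via_tempfile(data: dict):
    # hypothetical helper: write the dict to a temporary JSON file first
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as tmp:
        json.dump(data, tmp)
        local_path = tmp.name
    storage_client = storage.Client()
    bucket = storage_client.get_bucket("your-bucket-name")  # placeholder name
    blob = bucket.blob("filename-you-want-here")
    # reads the file from disk and uploads its contents
    blob.upload_from_filename(local_path, content_type="application/json")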
Hope this helps!!
Upvotes: 3
Reputation: 76010
For uploading data to Cloud Storage, you have only 3 methods on the Python Blob object:
- upload_from_file()
- upload_from_filename()
- upload_from_string()
From a dict, it's up to you to choose to convert it into a string and use the upload_from_string method. Or you can also store it locally, in the /tmp directory (an in-memory file system in Google's serverless environments), and then use the file-oriented methods, as sketched below.
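A sketch of that /tmp route (the bucket name is a placeholder and upload_via_tmp is a hypothetical name): dump the dict to a local file, then pass an open handle to upload_from_file().

import json
from google.cloud import storage

def upload_via_tmp(data: dict):
    # hypothetical helper illustrating the local-file route
    local_path = "/tmp/data.json"
    with open(local_path, "w") as f:
        json.dump(data, f)
    storage_client = storage.Client()
    bucket = storage_client.get_bucket("your-bucket-name")  # placeholder name
    blob = bucket.blob("data.json")
    # upload_from_file accepts any file-like object opened for reading
    with open(local_path, "rb") as f:
        blob.upload_from_file(f, content_type="application/json")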
You may have more capabilities with files, for example if you want to zip the content and/or use a dedicated library that dumps a dict into a file; a gzip variant is sketched below.
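One hedged sketch of the compressed variant (bucket and object names are again placeholders): write the dict as gzipped JSON, then set the blob's content_encoding so clients can decompress it transparently.

import gzip
import json
from google.cloud import storage

def upload_gzipped(data: dict):
    # hypothetical helper: gzip the JSON to save storage and bandwidth
    local_path = "/tmp/data.json.gz"
    with gzip.open(local_path, "wt", encoding="utf-8") as f:
        json.dump(data, f)
    storage_client = storage.Client()
    bucket = storage_client.get_bucket("your-bucket-name")  # placeholder name
    blob = bucket.blob("data.json")
    # content_encoding marks the payload as gzip-compressed in the object metadata
    blob.content_encoding = "gzip"
    blob.upload_from_filename(local_path, content_type="application/json")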
Upvotes: 3