Reputation: 443
I have created a BlobTrigger in an Azure Function App to read any new file inserted or updated in my Azure Blob storage.
The trigger correctly identifies the latest files inserted or updated in my blob container, and I am able to print the JSON body of the file.
However, when I try to store the JSON object in a variable so I can transform it, it throws an error.
I would like to assign each key of the JSON to a variable. My JSON is
{
"name":"Saikat",
"id":"1234"
}
Below is the code where I can print the JSON, followed by the version that throws the error when I try to store it.
import logging
import azure.functions as func
import json

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    print("JSON Body", json.load(myblob))
import logging
import azure.functions as func
import json

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    print("JSON Body", json.load(myblob))
    # Store JSON file
    jsonData = json.load(myblob)
    print("****jsonData*****", jsonData)
Upvotes: 2
Views: 4537
Reputation: 136126
Essentially, you're getting this error because you're reading from the stream twice. After the first read, the stream's read position is at the end of the stream, which is why the second read fails.
Based on the comments below, since the InputStream object does not support the seek operation, the solution to your problem is to read the stream just once.
Try something like the following:
import logging
import azure.functions as func
import json

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    jsonData = json.load(myblob)
    print("JSON Body", jsonData)
    # Store JSON file
    print("****jsonData*****", jsonData)
Upvotes: 2