Reputation: 43
I am trying to retrieve data from a queue using a QueueTrigger and write it into a blob. For the first message, the data is successfully retrieved from the queue and written into the blob file. When the queue triggers a second time, the new message is retrieved and written into the blob file, but it overwrites the first message. Please let me know how to append the second message to the blob file without removing/deleting the first one. As I am new to Python and Azure Functions, the code might not be correct.
import logging
import azure.functions as func

def main(queuemsg: func.QueueMessage, outputblob: func.Out[str]):
    # get_body() returns bytes; decode before writing to the str output binding
    msg = queuemsg.get_body().decode('utf-8')
    logging.info('Python Queue trigger function processed %s', msg)
    outputblob.set(msg)
Upvotes: 3
Views: 2335
Reputation: 57
I stumbled upon a similar problem of collating database query results and writing them to a single blob using Azure Functions. I think this approach would work for your problem too. You just need to serialize the content for the output blob as a string.
import azure.functions as func
import json
import logging

def main(msgIn: func.QueueMessage, documents: func.DocumentList, outputBlob: func.Out[str]) -> None:
    if documents:
        logging.info('documents found in cosmosdb, saving to blob...')
        # need to serialize as a string to output to blob
        complete_docs_str = ','.join([doc.to_json() for doc in documents])
        outputBlob.set(complete_docs_str)
        logging.info('wrote to blob: %s', complete_docs_str)
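The serialization step can be tried standalone; here is a minimal sketch using plain dicts with `json.dumps` standing in for the Cosmos DB documents and their `to_json()` method:

```python
import json

# Hypothetical documents standing in for the func.DocumentList entries
documents = [{"id": "1", "msg": "first"}, {"id": "2", "msg": "second"}]

# Serialize each document to JSON and join them into one string for the blob
complete_docs_str = ','.join(json.dumps(doc) for doc in documents)
print(complete_docs_str)
# → {"id": "1", "msg": "first"},{"id": "2", "msg": "second"}
```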
Upvotes: 1
Reputation: 14324
For now the blob output binding doesn't support appending to a blob directly, so you could try other ways.
Firstly, you could combine a blob input binding with the blob output binding; you could refer to the code below.
import logging
import azure.functions as func

def main(msg: func.QueueMessage, inputblob: func.InputStream, outputblob: func.Out[str]) -> None:
    logging.info('Python queue trigger function processed a queue item: %s',
                 msg.get_body().decode('utf-8'))
    outmessage = inputblob.read().decode('utf-8') + msg.get_body().decode('utf-8')
    logging.info(outmessage)
    outputblob.set(outmessage)
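For this to work, the function's bindings must declare the queue trigger plus a blob input and a blob output that point at the same blob path. A sketch of a possible function.json (the queue name, container/blob path, and connection setting are placeholders):

```json
{
  "bindings": [
    {
      "name": "msg",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "myqueue",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "inputblob",
      "type": "blob",
      "direction": "in",
      "path": "outcontainer/out.txt",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "direction": "out",
      "path": "outcontainer/out.txt",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```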
The other way is to use the Azure Storage Blob SDK, which provides an append_block method for append blobs.
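A minimal sketch of that approach inside the queue-triggered function, assuming the v12 azure-storage-blob package (the connection string, container, and blob names are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection details -- substitute your own
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="outcontainer", blob="messages.txt")

# An append blob must be created once before blocks can be appended to it
if not blob.exists():
    blob.create_append_blob()

# Each queue message becomes a new block appended after the existing content;
# msg is the func.QueueMessage from the trigger
blob.append_block(msg.get_body())
```

Because append_block only adds to the end of the blob, each trigger invocation preserves the earlier messages instead of overwriting them.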
Upvotes: 2