James

Reputation: 7543

Azure Functions - bind a list of blobs from a queue trigger

I am writing a function to stitch lots of input images together into one tall image, and save that to an output blob. Is there a way to bind to the variable number of inputs from a queue trigger?

Here's an example queue item:

{
  "inputIds": [1001, 1002, 1003, 1004], // Input blobs "inputs/1001.jpg", etc.
  "outputId": 15                        // Output blob "output/15.jpg"
}

I've had a look at the options in the docs and my current thinking is to use a [Blob] attribute on a CloudBlobContainer which I think is allowed:

[FunctionName("stitch")]
public static async Task Run(
  [QueueTrigger("stitch")]QueueItem queueItem, 
  [Blob("inputs")]CloudBlobContainer inputContainer, 
  [Blob("output/{outputId}.jpg")]CloudBlockBlob output)
{
  IList<CloudBlockBlob> inputs = await GetBlobsAsync(inputContainer, queueItem.InputIds);
  Stream result = ConcatImages(inputs);
  await output.UploadFromStreamAsync(result);
}

In this case, QueueItem is a custom class and GetBlobsAsync() and ConcatImages() are functions I can implement.

But can I bind directly to an IEnumerable<CloudBlockBlob> (or IEnumerable<Stream>), and if so how would I write the [Blob] input attribute?

EDIT: And could I do so without loading all the input blob contents into memory at the same time? I need to stream them one at a time.

Upvotes: 1

Views: 633

Answers (1)

camelCase

Reputation: 1790

I'd suggest a different approach.

You're using the queue trigger correctly, and the QueueItem payload defines the scope of the image-stitching task. But why try to bind directly to the input blobs? Fetch them dynamically in code as you loop through the inputIds, using the container binding you already have. This also addresses your other concern: you can open and process one blob stream at a time rather than holding every input in memory at once.
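A minimal sketch of that loop, using the same WindowsAzure.Storage types as the question (GetBlockBlobReference and OpenReadAsync are real SDK methods; AppendImageAsync is a hypothetical stand-in for your stitching step):

```csharp
// Sketch only: fetch and process input blobs one at a time,
// so only a single blob's stream is open at any moment.
// Assumes Microsoft.WindowsAzure.Storage.Blob, as in the question.
private static async Task StitchAsync(
    CloudBlobContainer inputContainer,
    IEnumerable<int> inputIds,
    CloudBlockBlob output)
{
    foreach (int id in inputIds)
    {
        // Getting a reference is cheap; no network call happens here.
        CloudBlockBlob input = inputContainer.GetBlockBlobReference($"{id}.jpg");

        // OpenReadAsync streams the blob contents on demand.
        using (Stream stream = await input.OpenReadAsync())
        {
            // Hypothetical placeholder for your stitching logic,
            // e.g. copying this image's rows into the tall output image.
            await AppendImageAsync(stream);
        }
    }
}
```

Disposing each stream before moving to the next id keeps peak memory roughly at one input image, which is exactly the streaming behaviour asked about in the edit.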

Upvotes: 1
