Hugobop

Reputation: 470

Azure function to send data to a blob container as the output binding

This answer does help some, but I am looking for more clarity and examples, ideally for a Node environment (a high-level approach in a different programming language is fine too). I am on the newer side of cloud concepts, so it's possible I am missing some high-level understanding.

In the above link, @harikrishnarajoli-mt explains a similar scenario:

Scenario: an Azure Function HTTP trigger using EF Core. Upon saving the data to the database, a queue is used as an output binding to save the data to local storage. Secondly, another function is triggered upon insertion of data into the queue store. Here the queue acts as an input binding, and the queue's data is stored in the blob container as an output binding. The blob container is used as an output binding by specifying the Blob attribute for each queue trigger.

I understand how to create an HTTP trigger that outputs a message to the storage queue. I am stuck on the latter half: how do I get the data stored in the blob container as an output binding?

My specific situation: I have an API hosted, say, at somesite.com/route1. I want to store all the requests/responses made to that endpoint in Azure Storage. EDIT: my game plan is: HTTP request triggers a queue message >> the queue message triggers the blob to be written.

Right now I have a function that sends to the queue, but I cannot for the life of me figure out how to send to blob storage using an Azure function. This page from Microsoft does not make sense to me, as I don't see where the data goes into a blob container.

My Files...

function.json

{
  "scriptFile": "./index.js",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post",
        "put",
        "delete"
      ]
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "outputQueue",
      "queueName": "apirequests",
      "connection": ""
    },
    {
    WHAT GOES IN HERE SO THAT THE OUTPUT CAN GO TO THE BLOB-CONTAINER? 
    }

  ]
}

index.js

const { v4: genId } = require('uuid');
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');
    const reqBody = JSON.stringify(req.body);

    const wholeObj = {
        GUID: genId(),
        method: req.method,
        url: req.url,
        request: {
            headers: req.headers,
            body: reqBody
        },
        response: {..}
       ...more stuff...
    }
    context.bindings.outputQueue = `${wholeObj.GUID}.json`; // this line sends the GUID to the queue.

// how can I send the wholeObj as a blob to my blob-container?

}

Is it possible to send data to a blob container through a function binding? I have only seen it done with a package like azure-storage, but I want to confirm whether it can be done with a binding instead. Please share any advice you have on what to add to these two files to achieve my desired result, or, if there is a different approach you think I should take, please share!

Thank you

Upvotes: 0

Views: 3748

Answers (3)

Muhammad Ahmod

Reputation: 709

There are multiple kinds of function triggers, but an Azure function can have only one trigger.

Azure functions can have multiple input and multiple output bindings.

You need 2 functions:

  1. An httpTrigger function with a queue storage output binding. Source

[StorageAccount("MyStorageConnectionAppSetting")]
public static class QueueFunctions
{
    [FunctionName("QueueOutput")]
    [return: Queue("myqueue-items")]
    public static string QueueOutput([HttpTrigger] dynamic input,  ILogger log)
    {
        log.LogInformation($"C# function processed: {input.Text}");
        return input.Text;
    }
}

  2. A queueTrigger function with a blob storage output binding. Source

[FunctionName("queue")]
public static void Run(
    [QueueTrigger("queue", Connection = "MyStorageConnectionAppSetting")] string queueItem,
    [Blob("storage/{queueTrigger}", Connection = "MyStorageConnectionAppSetting")] out string acceptedCCApplication,
    ILogger log)
{
    log.LogInformation($"C# queue trigger processed: {queueItem}");
    acceptedCCApplication = queueItem; // the queue message text becomes the blob content
}

The httptrigger function will output to the storage queue. Once that message arrives in the storage queue it will trigger your queue storage function which will output to your blob storage.

If you want to output to both the queue and the blob at the same time, you only need one HTTP trigger function with two output bindings: queue and blob.

Additionally, you can skip the queue storage if you are not doing heavy processing of what you are storing: just create an HTTP trigger function with a blob storage output binding.

The examples above are based on .NET. They might have a few errors, but that's the gist of it.

Upvotes: 1

Hugobop

Reputation: 470

Everything cleared up for me when I read this: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-expressions-patterns

The path property holds the container information. From the docs:

Most expressions are identified by wrapping them in curly braces. For example, in a queue trigger function, {queueTrigger} resolves to the queue message text. If the path property for a blob output binding is container/{queueTrigger} and the function is triggered by a queue message HelloWorld, a blob named HelloWorld is created.

So I added below to my function.json for my queue-trigger function and it works appropriately.

....
    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "containerName/{queueTrigger}",
      "connection": "",
      "direction": "out"
    }

Upvotes: 1

Brent George

Reputation: 148

I'm not sure of the scope of your project, but an alternative approach to connect your Node app to various Azure components would be Dapr.

Dapr serves almost as a middle-man API that abstracts away many of the details of actually connecting to Blob Storage, Event Hubs, or any other Azure component you may be using. This is nice because it simplifies connecting new components, and makes it easy to switch out components on your backend if you change cloud providers, without having to change any of your code.

There is a tutorial for output bindings. I am using Dapr to bind to an Event Hub and write to Blob Storage as well.
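From Node, invoking a Dapr output binding is a plain HTTP POST to the sidecar. A minimal sketch, assuming a Dapr component named `blob-store` (an Azure Blob Storage binding) and the default sidecar port 3500; both names are illustrative:

```javascript
// Build the request for Dapr's binding API: POST /v1.0/bindings/<name>
// with { operation, data, metadata }. For the Blob Storage binding,
// operation "create" writes a blob, and the optional blobName metadata
// lets you choose the blob's name.
function buildBindingRequest(bindingName, data, blobName) {
    return {
        url: `http://localhost:3500/v1.0/bindings/${bindingName}`,
        payload: {
            operation: 'create',
            data,
            metadata: { blobName }
        }
    };
}

// Usage (requires a running Dapr sidecar):
// const { url, payload } = buildBindingRequest('blob-store', { a: 1 }, 'a.json');
// await fetch(url, {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body: JSON.stringify(payload)
// });
```

Swapping the backing store (e.g. from Blob Storage to S3) then only means changing the Dapr component definition; this code is untouched.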

Upvotes: 1
