techresearch

Reputation: 169

Handling large API data in a Logic App

In my Logic App I am not able to call an HTTP API that returns more than 100 MB of data. I get the error message below.

BadRequest. Http request failed as there is an error: 'Cannot write more bytes to the buffer than the configured maximum buffer size: 104857600.'.

I can't enable chunking in the connector because the API itself doesn't support it (the API provider won't add support). I then tried calling the API from an Azure Function and returning the data to the Logic App, but that didn't help either; I got the same error with that approach. My next thought was: can we call the API inside an Azure Function, have the function turn the response into a chunked response, and then read that data in the Logic App with chunking enabled? I am not sure whether this is possible. If it is, how can we achieve it in the Azure Function? If not, what are the other options for handling large data from an API?
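
What I had in mind is roughly the sketch below: an Azure Function that proxies the API and honours the Range header, so that the Logic App HTTP action with Allow chunking turned on can pull the payload in pieces (chunked download in Logic Apps relies on partial-content requests). This is only a rough Python sketch; UPSTREAM_URL is a placeholder for the real API, and the function still buffers the full response in memory on every range request, so I'm not sure it is a good idea.

```python
import os

import azure.functions as func
import requests

# Placeholder for the real upstream API that returns the >100 MB payload.
UPSTREAM_URL = os.environ.get("UPSTREAM_URL", "https://example.com/large-data")


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Fetch the whole upstream payload. Caveat: this buffers the full body in
    # the function's memory and runs again for every range request the
    # Logic App makes, so it is only meant to illustrate the idea.
    upstream = requests.get(UPSTREAM_URL)
    upstream.raise_for_status()
    payload = upstream.content
    total = len(payload)

    range_header = req.headers.get("Range")
    if not range_header:
        # No Range header: return the whole payload in one response.
        return func.HttpResponse(
            body=payload,
            status_code=200,
            headers={"Accept-Ranges": "bytes", "Content-Length": str(total)},
            mimetype="application/octet-stream",
        )

    # Parse a simple "bytes=start-end" range and return just that slice.
    _, _, span = range_header.partition("=")
    start_text, _, end_text = span.partition("-")
    start = int(start_text) if start_text else 0
    end = min(int(end_text), total - 1) if end_text else total - 1
    chunk = payload[start:end + 1]

    return func.HttpResponse(
        body=chunk,
        status_code=206,  # Partial Content
        headers={
            "Content-Range": f"bytes {start}-{end}/{total}",
            "Accept-Ranges": "bytes",
            "Content-Length": str(len(chunk)),
        },
        mimetype="application/octet-stream",
    )
```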

Upvotes: 0

Views: 1754

Answers (2)

SwethaKandikonda

Reputation: 8244

You may try the below approach:

  1. Logic Apps Resource -> Logic Apps Designer

  2. Under Built-in, find the Batch Messages trigger and fill in the required details according to your requirements

  3. Now click on New step

  4. Under Built-in choose HTTP

  5. In the HTTP action's upper-right corner, choose the ellipsis button (...), and then choose Settings.

  6. Under Content Transfer, set Allow chunking to On (the equivalent code-view JSON is shown after these steps).

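If you prefer code view, turning Allow chunking on corresponds to adding a runtimeConfiguration block to the HTTP action in the workflow definition, roughly like the snippet below (the URI is just a placeholder for your API):

```json
"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "GET",
        "uri": "https://example.com/large-data"
    },
    "runtimeConfiguration": {
        "contentTransfer": {
            "transferMode": "Chunked"
        }
    }
}
```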

Upvotes: 1

Sergiy Kostenko

Reputation: 283

I guess the issue is in your app architecture. What kind of data are you using? If it is big blobs, you can save them in an Azure storage account. If it is lots of messages (e.g. IoT sensor data), you can use Azure Stream Analytics or storage queues to process them one by one.
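
For the blob case, a minimal sketch of that idea (assuming a Python Azure Function with the requests and azure-storage-blob packages; UPSTREAM_URL, the connection-string setting, and the large-payloads container are all placeholders) is to stream the API response into the storage account and return only a reference to the blob, so the Logic App never has to buffer the full payload:

```python
import os
import uuid

import azure.functions as func
import requests
from azure.storage.blob import BlobClient

# Placeholders: the upstream API and the storage account connection string.
UPSTREAM_URL = os.environ["UPSTREAM_URL"]
STORAGE_CONNECTION_STRING = os.environ["AzureWebJobsStorage"]
CONTAINER = "large-payloads"


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Stage the large API response in Blob Storage so no single Logic App
    # action has to hold the full payload.
    blob_name = f"api-payload-{uuid.uuid4()}.json"
    blob = BlobClient.from_connection_string(
        STORAGE_CONNECTION_STRING, container_name=CONTAINER, blob_name=blob_name
    )

    # Stream the upstream response straight into the blob instead of
    # buffering the whole 100+ MB body in memory.
    with requests.get(UPSTREAM_URL, stream=True) as upstream:
        upstream.raise_for_status()
        upstream.raw.decode_content = True
        blob.upload_blob(upstream.raw, overwrite=True)

    # Return only a small JSON pointer to the staged blob.
    return func.HttpResponse(
        body=f'{{"container": "{CONTAINER}", "blob": "{blob_name}"}}',
        mimetype="application/json",
        status_code=200,
    )
```

The Logic App can then read the staged blob with the Azure Blob Storage connector, which supports chunking for large files.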

Upvotes: 0
