Reputation: 169
In my Logic App I am not able to call an HTTP API that returns more than 100 MB of data. I get the error message below.
BadRequest. Http request failed as there is an error: 'Cannot write more bytes to the buffer than the configured maximum buffer size: 104857600.'.
I can't enable chunking in the connector because the API doesn't support it (the API provider won't support it). So I tried calling this API from an Azure Function and returning the data to the Logic App, but that didn't help either; I got the same error with this approach as well.

My thought was: can we call the API in an Azure Function and convert the response into a chunked response in the Azure Function, so that the Logic App can read the data with chunking enabled? I am not sure if this is possible. If it is, how can we achieve it in an Azure Function? If not, what are the various solutions for handling large data from an API?
Upvotes: 0
Views: 1754
Reputation: 8244
You may try the approach below:
Go to your Logic App resource -> Logic App Designer
Under Built-in, find the Batch Messages trigger and fill in the required details according to your requirements
Then click on New step
Under Built-in, choose HTTP
In the HTTP action's upper-right corner, choose the ellipsis button (...), and then choose Settings.
Under Content Transfer, set Allow chunking to On.
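If you prefer the code view, the same setting can be applied directly on the HTTP action in the workflow definition. A minimal sketch, where the action name, method, and URI are placeholders for your own values:

```json
{
  "HTTP": {
    "type": "Http",
    "inputs": {
      "method": "GET",
      "uri": "https://example.com/large-data"
    },
    "runtimeConfiguration": {
      "contentTransfer": {
        "transferMode": "Chunked"
      }
    }
  }
}
```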
Upvotes: 1
Reputation: 283
I guess the issue is in your app architecture. What kind of data are you using? If it is big blobs, you can save them in an Azure Storage account. If it is lots of messages (e.g. IoT sensor data), you can use Azure Stream Analytics or Storage Queues to process them one by one.
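For the big-blob case, one pattern worth sketching is to have the Azure Function stream the upstream API response into Blob Storage and return only a blob reference to the Logic App, which can then read the blob with the Azure Blob Storage connector (that connector does support chunking). A minimal Python sketch, assuming a hypothetical upstream URL, a STORAGE_CONNECTION_STRING app setting, and a pre-created large-responses container:

```python
import os
import requests
import azure.functions as func
from azure.storage.blob import BlobClient

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Hypothetical upstream API; replace with the real endpoint.
    upstream_url = "https://example.com/large-data"

    blob = BlobClient.from_connection_string(
        conn_str=os.environ["STORAGE_CONNECTION_STRING"],
        container_name="large-responses",
        blob_name="api-payload.json",
    )

    # Stream the upstream response into the blob so the function
    # never buffers the full 100+ MB body in memory.
    with requests.get(upstream_url, stream=True) as resp:
        resp.raise_for_status()
        resp.raw.decode_content = True
        blob.upload_blob(resp.raw, overwrite=True)

    # Return only the blob URL; the Logic App fetches the content
    # via the Blob Storage connector instead of the HTTP action.
    return func.HttpResponse(blob.url, status_code=200)
```

This keeps the Logic App run itself small, since only the URL travels through the workflow, and the large payload moves between the API, the Function, and storage as a stream.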
Upvotes: 0