Reputation: 225
I am creating a small Azure Function with an HTTP trigger, hosted on Azure. Its main purpose is simply to receive POST requests with a CSV file as the payload.
However, some of the posted files are larger than 100 MB, which results in the following exception:
"Exception while executing function -> Exception binding parameter 'req' -> The maximum message size quota for incoming messages (104857600) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element.
I have tried to Google my way around this; the Azure Functions documentation from Microsoft ("functions-bindings-http-webhook") states that:
The HTTP request length is limited to 100 MB (104,857,600 bytes), and the URL length is limited to 4 KB (4,096 bytes). These limits are specified by the httpRuntime element of the runtime's Web.config file.
Yet I am unable to locate a Web.config anywhere in my solution, so I suspect this is not really applicable here, or am I just totally off?
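For reference, in a regular ASP.NET application I would have expected to raise these limits in Web.config with something like the sketch below (the values are only placeholders corresponding to roughly 200 MB):

<configuration>
  <system.web>
    <!-- maxRequestLength is specified in KB -->
    <httpRuntime maxRequestLength="204800" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is specified in bytes -->
        <requestLimits maxAllowedContentLength="209715200" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>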
The project is generated from Visual Studio with the following related dependencies and extensions:
Microsoft.Net.Sdk.Functions (1.0.24)
Azure Functions and Web Jobs Tools (15.10.2046.0)
Is there any way to increase the maximum message size limit for incoming requests?
Upvotes: 2
Views: 9447
Reputation: 331
It is true that passing large files through a storage account is the best practice.
Still, there is a configuration key in the Azure Function app settings that explicitly sets the limit higher than 100 MB.
The key name is FUNCTIONS_REQUEST_BODY_SIZE_LIMIT
Check this example for how to include it in local.settings.json:
Azure-Samples/azure-functions-nodejs-stream
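For illustration, a minimal local.settings.json carrying this key might look like the sketch below (the value is in bytes, so 209715200 corresponds to 200 MB; the storage connection and worker runtime shown are placeholders):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "FUNCTIONS_REQUEST_BODY_SIZE_LIMIT": "209715200"
  }
}

When running in Azure, the same key has to be added as an application setting on the Function App, since local.settings.json only applies to local development.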
Upvotes: 0
Reputation: 5042
Does the App Service Editor work for you?
However, I believe this approach is risky: in case of networking issues, you can easily run into a timeout.
I would recommend that you upload your CSVs to Blob Storage and create a blob trigger for your function.
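As a rough sketch of that setup (assuming an in-process C# function, a container named csv-uploads and the default AzureWebJobsStorage connection; adapt the names to your project):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessCsvBlob
{
    [FunctionName("ProcessCsvBlob")]
    public static void Run(
        // Fires whenever a new blob lands in the (assumed) csv-uploads container.
        [BlobTrigger("csv-uploads/{name}", Connection = "AzureWebJobsStorage")] Stream csvBlob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing CSV blob '{name}' ({csvBlob.Length} bytes)");

        // Stream the CSV line by line instead of loading the whole file into memory.
        using (var reader = new StreamReader(csvBlob))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Handle each CSV row here.
            }
        }
    }
}

The client then uploads the CSV directly to Blob Storage (for example via a SAS URL), so the function's HTTP request size limit no longer applies.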
Upvotes: 2