Farizrha

Reputation: 1

I can't download large parquet files from Azure Data Lake Storage using an Azure Function HTTP trigger with a Python script

I was trying to download several parquet files from Azure Data Lake, but there seems to be some sort of size limit when downloading them through the Azure Function.

When I run the Azure Function locally through VS Code, it takes so long that it appears to hang, so I deployed it and ran it in the Azure cloud instead. However, it gives this error when I test it with an HTTP GET request: "Error while testing it in azure"

So is there any limit on downloading multiple large files, for example a 100 MB parquet file, from Azure Data Lake Storage using an Azure Function HTTP trigger?

I expect to be able to download all of the files from Azure Data Lake Storage using the Azure Function.

Upvotes: 0

Views: 322

Answers (1)

SwethaKandikonda

Reputation: 8254

You are receiving this error because memory consumption exceeded the limits of the Consumption plan. According to the official documentation, the Consumption plan has a maximum of 1.5 GB of memory per instance:

| Resource | Consumption Plan | Dedicated Plan |
| --- | --- | --- |
| Max memory (GB per instance) | 1.5 | 1.75-14 |

You can either switch to a Dedicated hosting plan, which provides 1.75-14 GB of memory along with other additional resources, or reduce the amount of data that is held in memory by a single instance.
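One way to reduce per-request memory (a sketch of the general pattern, not code from the question) is to stream the file in chunks instead of reading it all at once. With the `azure-storage-file-datalake` package, `DataLakeFileClient.download_file()` returns a `StorageStreamDownloader` whose `chunks()` iterator yields the file piece by piece, so you never need `readall()` on a 100 MB file. The relay helper below works with any object exposing a `chunks()` iterator; since the demo cannot reach real storage, it uses a small fake downloader (the `FakeDownloader` name and chunk size are illustrative assumptions):

```python
import io
from typing import Iterator


def relay_chunks(downloader, sink) -> int:
    """Copy a download into `sink` chunk by chunk, so at most one chunk
    is held in memory at a time. `downloader` only needs a .chunks()
    iterator, which azure.storage.filedatalake's StorageStreamDownloader
    provides."""
    total = 0
    for chunk in downloader.chunks():
        sink.write(chunk)
        total += len(chunk)
    return total


class FakeDownloader:
    """Stand-in for StorageStreamDownloader in this sketch."""

    def __init__(self, data: bytes, chunk_size: int = 4):
        self._data = data
        self._chunk_size = chunk_size

    def chunks(self) -> Iterator[bytes]:
        # Yield fixed-size slices, mimicking a chunked network download.
        for i in range(0, len(self._data), self._chunk_size):
            yield self._data[i : i + self._chunk_size]


# Against real storage you would obtain the downloader roughly like:
#   service = DataLakeServiceClient(account_url, credential=...)
#   file_client = service.get_file_system_client("my-fs").get_file_client("big.parquet")
#   downloader = file_client.download_file()
sink = io.BytesIO()
copied = relay_chunks(FakeDownloader(b"parquet-bytes-here"), sink)
```

Streaming the chunks onward (for example into the HTTP response, or into temporary storage processed file-by-file) keeps the instance well under the 1.5 GB ceiling even for large parquet files.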

Upvotes: 0
