adventureworks

Reputation: 422

Unable to connect to Azure Databricks SQL from Azure Functions endpoint on Azure portal

I have developed an Azure Functions project library using VS 2022. I am accessing a table in Azure Databricks SQL via the Apache Spark ODBC driver connection settings. I am able to connect to the table and run the HTTP function endpoint locally (e.g. localhost). But after publishing to the Azure portal, the function endpoint returns "HTTP ERROR 500". It says <My_APP>.azurewebsites.net can't currently handle this request.

When I checked Application Insights for further log information, I found the details below.

Result: Failure Exception: System.DllNotFoundException: Dependency unixODBC with minimum version 2.3.1 is required. Unable to load shared library 'libodbc.so.2' or one of its dependencies. In order to help diagnose loading problems, consider using a tool like strace. If you're using glibc, consider setting the LD_DEBUG environment variable:
    /usr/share/dotnet/shared/Microsoft.NETCore.App/7.0.8/libodbc.so.2.so: cannot open shared object file: No such file or directory
    /home/site/wwwroot/runtimes/linux/lib/net7.0/libodbc.so.2.so: cannot open shared object file: No such file or directory
    /usr/share/dotnet/shared/Microsoft.NETCore.App/7.0.8/liblibodbc.so.2.so: cannot open shared object file: No such file or directory
    /home/site/wwwroot/runtimes/linux/lib/net7.0/liblibodbc.so.2.so: cannot open shared object file: No such file or directory
    /usr/share/dotnet/shared/Microsoft.NETCore.App/7.0.8/libodbc.so.2: cannot open shared object file: No such file or directory
    /home/site/wwwroot/runtimes/linux/lib/net7.0/libodbc.so.2: cannot open shared object file: No such file or directory
    /usr/share/dotnet/shared/Microsoft.NETCore.App/7.0.8/liblibodbc.so.2: cannot open shared object file: No such file or directory
    /home/site/wwwroot/runtimes/linux/lib/net7.0/liblibodbc.so.2: cannot open shared object file: No such file or directory
    at FetchData() in C:\MY_PROJECT\MyFunction.cs:line 54

Could someone shed some light on how to troubleshoot or configure my function endpoint on the Azure portal?

I am very new to Azure infrastructure in terms of integrating various components like Azure Databricks, so I would greatly appreciate some insights here.

So far I have determined one thing: Azure Functions HTTP endpoints can communicate with Azure Databricks SQL tables and query them via the ODBC driver.

What I am starting to think is that as soon as I publish the Azure Functions project and deploy it to the Azure portal, the same endpoints return an HTTP 500 error. In other words, there is some component between Azure Functions and Azure Databricks that I am missing.

Also, I have a host.json file which might require some additional settings to support this connectivity.

Should I look into other Azure services like Azure SQL, Azure Data Factory, or Azure Synapse, which might be the gateway to Azure Databricks catalogs, and access them from Azure Functions instead?

Upvotes: 0

Views: 594

Answers (1)

Alex Ott

Reputation: 87174

Looks like you don't have the correct version of the unixODBC package installed; most probably you're using an old Docker image or something like that (see this answer).
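If you are running the function app as a custom container, a minimal sketch of installing unixODBC on top of an Azure Functions base image might look like this (the base image tag and publish folder path here are assumptions for illustration, not taken from the question):

```dockerfile
# Hypothetical sketch: base image tag and COPY source path are assumptions
FROM mcr.microsoft.com/azure-functions/dotnet:4-dotnet7.0

# Install unixODBC, which provides the libodbc.so.2 the error message is probing for
RUN apt-get update && \
    apt-get install -y --no-install-recommends unixodbc odbcinst && \
    rm -rf /var/lib/apt/lists/*

# Copy the published function app into the expected location
COPY ./publish /home/site/wwwroot
```

You would still need to install and register the Databricks ODBC driver itself (a separate .deb/.rpm package) for the connection to work.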

But really, for Azure Functions it's much easier to use the SQL Statement Execution REST API, which doesn't need any external dependencies like ODBC drivers, etc.
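As a sketch of that approach: the Statement Execution API is a plain HTTPS POST to /api/2.0/sql/statements on your workspace, so no native driver is needed. The host, warehouse ID, and table name below are placeholders, not values from the question (the asker's project is C#, but the same request shape applies with HttpClient):

```python
import json

# Hypothetical workspace values: replace with your own
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
WAREHOUSE_ID = "abcdef1234567890"

def build_statement_request(host, warehouse_id, statement):
    """Build the URL and JSON body for the Databricks SQL Statement
    Execution API (POST /api/2.0/sql/statements)."""
    url = f"{host}/api/2.0/sql/statements"
    body = {
        "warehouse_id": warehouse_id,
        "statement": statement,
        "wait_timeout": "30s",  # wait synchronously up to 30 seconds
    }
    return url, body

url, body = build_statement_request(
    DATABRICKS_HOST, WAREHOUSE_ID,
    "SELECT * FROM my_catalog.my_schema.my_table LIMIT 10")
print(url)
print(json.dumps(body))

# To actually execute it you need a bearer token (PAT or AAD token), e.g.:
#   import urllib.request
#   req = urllib.request.Request(
#       url, data=json.dumps(body).encode(),
#       headers={"Authorization": f"Bearer {token}",
#                "Content-Type": "application/json"})
#   with urllib.request.urlopen(req) as resp:
#       result = json.load(resp)
```

Since the request is just JSON over HTTPS, the function app has no native-library dependency to break at deployment time.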

Upvotes: 0
