Jacek Laskowski

Reputation: 74749

How to check Delta Lake version in Databricks notebook?

How to check the Delta Lake version in a Databricks notebook?

(from Slack)

Upvotes: 6

Views: 7350

Answers (4)

partlov

Reputation: 14277

In theory this should work:

import importlib_metadata

# Read the installed version of the delta-spark pip package
delta_version = importlib_metadata.version("delta_spark")

Delta Lake itself detects its version the same way: https://github.com/delta-io/delta/blob/b44bd8a9023a318325fe8738a5c56b1325ed56b7/python/delta/pip_utils.py#L69
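
On Python 3.8+ the standard-library importlib.metadata does the same lookup without an extra dependency (a minimal sketch, assuming delta-spark is pip-installed on the cluster):

from importlib.metadata import PackageNotFoundError, version

try:
    # Distribution names are normalized, so "delta-spark" and "delta_spark" both resolve
    print(version("delta-spark"))
except PackageNotFoundError:
    print("delta-spark is not pip-installed on this cluster")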

Upvotes: 1

keen

Reputation: 31

As far as I can tell, unfortunately, there is no straightforward way.

However, searching for the Delta Lake JAR files might give an indication. At least on Azure Synapse Analytics this works:

import os
import re

def find_files(filename, search_path):
    # Walk search_path and collect files whose names match the given pattern
    result = []
    for root, dirs, files in os.walk(search_path):
        filtered_files = [file for file in files if re.match(filename, file)]
        for file in filtered_files:
            result.append(os.path.join(root, file))
    return result

>>> print(find_files(r"delta-.*\.jar", "/"))
['/usr/hdp/5.2-80682642/spark3/jars/delta-core_2.12-2.1.1.2.jar', '/usr/hdp/5.2-80682642/spark3/jars/delta-storage-2.1.1.2.jar']
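
From there the version can be pulled out of the file name itself (a sketch assuming the delta-core_<scala>-<version>.jar naming seen above; the helper name is made up):

import re

def delta_version_from_jar(path):
    # Extracts "2.1.1.2" from ".../delta-core_2.12-2.1.1.2.jar"
    m = re.search(r"delta-core_[\d.]+-([\d.]+)\.jar$", path)
    return m.group(1) if m else None

>>> print(delta_version_from_jar("/usr/hdp/5.2-80682642/spark3/jars/delta-core_2.12-2.1.1.2.jar"))
2.1.1.2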

And one more way that might indicate the version:

>>> spark.conf.get("spark.dotnet.packages").split(';')
['nuget:Microsoft.Spark,2.1.0-prerelease.22115.1',
 'nuget:Microsoft.Spark.Extensions.DotNet.Interactive,2.1.0-prerelease.22115.1',
 'nuget:Microsoft.Spark.Extensions.Delta,2.1.0-prerelease.22115.1',
 'nuget:Microsoft.Spark.Extensions.Hyperspace,2.1.0-prerelease.22115.1',
 'nuget:Microsoft.Spark.Extensions.Azure.Synapse.Analytics,0.15.0']
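
That list can be filtered down to the Delta-related entry (a sketch assuming the semicolon-separated name,version format shown above; note this is the version of the .NET Delta extension, which only hints at the underlying Delta Lake version):

pkgs = spark.conf.get("spark.dotnet.packages").split(";")
# Keep only the Delta extension entry, e.g. "nuget:Microsoft.Spark.Extensions.Delta,..."
delta_pkg = next((p for p in pkgs if ".Delta," in p), None)
if delta_pkg:
    print(delta_pkg.split(",")[-1])  # 2.1.0-prerelease.22115.1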

Upvotes: 1

sumitya

Reputation: 2691

How about checking with Databricks' dbutils?

// Prints the Databricks Runtime version (e.g. "11.3.x-scala2.12"), not the Delta Lake version itself
println(dbutils.notebook.getContext.tags("sparkVersion"))
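
In a Python cell the same tag can be reached through the py4j entry point (a sketch; this bridge is undocumented and may differ across runtimes):

tags = dbutils.notebook.entry_point.getDbutils().notebook().getContext().tags()
print(tags.apply("sparkVersion"))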

Upvotes: 0

Jacek Laskowski

Reputation: 74749

You'd have to get the runtime version and match it up with the Delta Lake version included in that runtime.

spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

and then check the Databricks Runtime release notes.
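
That lookup could be scripted against a hand-maintained table (a sketch only; the runtime keys and Delta versions below are illustrative assumptions to be verified against the release notes):

# Illustrative mapping only: verify each pairing against the
# Databricks Runtime release notes before relying on it.
RUNTIME_TO_DELTA = {
    "11.3.x-scala2.12": "2.1",  # assumed pairing
    "12.2.x-scala2.12": "2.2",  # assumed pairing
}

runtime = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
print(RUNTIME_TO_DELTA.get(runtime, "unknown runtime: check the release notes"))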

Upvotes: 0
