Sreedhar

Reputation: 30045

SQL Database using JDBC + parameterize SQL Query + Databricks

In Databricks I am reading a SQL table as follows:

val TransformationRules = spark.read.jdbc(jdbcUrl, "ADF.TransformationRules", connectionProperties)
.select("RuleCode","SourceSystem","PrimaryTable", "PrimaryColumn", "SecondaryColumn", "NewColumnName","CurrentFlag")
.where("SourceSystem = 'QWDS' AND RuleCode = 'STD00003' ")

How can I parameterize SourceSystem and RuleCode in the WHERE clause?

I was referring to: https://learn.microsoft.com/en-us/azure/databricks/data/data-sources/sql-databases

Upvotes: 0

Views: 320

Answers (2)

Alfilercio

Reputation: 1118

If you import the Spark implicits, you can create references to columns with the dollar ($) interpolator and use the Column API to build the filter logic. It will look something like this:

val sourceSystem = "QWDS"
val ruleCode = "STD00003"

import spark.implicits._
val TransformationRules = spark.read.jdbc(jdbcUrl, "ADF.TransformationRules", connectionProperties)
.select("RuleCode","SourceSystem","PrimaryTable", "PrimaryColumn", "SecondaryColumn", "NewColumnName","CurrentFlag")
.where($"SourceSystem" === sourceSystem && $"RuleCode" === ruleCode)

import org.apache.spark.sql.Column
val ssColumn: Column = $"SourceSystem" // the $ interpolator yields a Column

As you can see, the dollar interpolator gives you a Column object, which supports operations such as comparison, casting, renaming, etc. In combination with the functions in org.apache.spark.sql.functions, it will let you implement almost anything you need.
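For example, a minimal sketch of what that enables (the derived column names here are purely illustrative, reusing the TransformationRules frame from above):

import org.apache.spark.sql.functions.{upper, when}

val enriched = TransformationRules
.withColumn("SourceSystemUpper", upper($"SourceSystem"))                        // transform a column
.withColumn("IsCurrent", when($"CurrentFlag" === "Y", true).otherwise(false))   // conditional logic
.withColumnRenamed("NewColumnName", "TargetColumn")                             // rename a column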

Upvotes: 1

Philipp Hölscher

Reputation: 71

If I understand your question correctly, you want to insert values into the WHERE-clause string? Then the solution below could work for you:

val sourceSystem = "QWDS"
val ruleCode = "STD00003"

val TransformationRules = spark.read.jdbc(jdbcUrl, "ADF.TransformationRules", connectionProperties)
.select("RuleCode","SourceSystem","PrimaryTable", "PrimaryColumn", "SecondaryColumn", "NewColumnName","CurrentFlag")
.where("SourceSystem = '%s' AND RuleCode = '%s'".format(sourceSystem, ruleCode)) // format uses %s placeholders in Scala, not {}

Upvotes: 0
