Shrince

Reputation: 101

How to use variables in SQL queries?

In SQL Server we can declare variables like declare @sparksql='<any query/value/string>', but in Spark SQL what alternative can be used, so that we don't need to hard-code any values/queries/strings?

Upvotes: 7

Views: 36145

Answers (5)

Chris Amelinckx

Reputation: 4482

An example, in a Databricks SQL notebook.

Use two commands:

Cmd 1
CREATE WIDGET TEXT myVariable DEFAULT "1234";

Cmd 2
SELECT 1234 value, 1234 = ${myVariable} comparison
UNION ALL
SELECT 4567, 4567 = ${myVariable}

[screenshot: Databricks notebook with the above example]

Upvotes: 0

Alex Ott

Reputation: 87184

There is support for variable substitution in Spark, at least from version 2.1.x. It's controlled by the configuration option spark.sql.variable.substitute; in 3.0.x it's set to true by default (you can check it by executing SET spark.sql.variable.substitute).

With that option set to true, you can set a variable to a specific value with SET myVar=123, and then use it via the ${varName} syntax, like: select ${myVar}...
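For illustration, a minimal end-to-end sketch (the variable name myVar and the table some_db.events are hypothetical placeholders):

SET spark.sql.variable.substitute=true;   -- already the default in 3.0.x
SET myVar=123;

-- ${myVar} is substituted textually before the query is parsed
SELECT ${myVar} AS my_value;                         -- becomes: SELECT 123 AS my_value
SELECT * FROM some_db.events WHERE id = ${myVar};    -- hypothetical table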

On Databricks, the parser also recognizes that syntax and creates a field to populate the value, although it would be easier to use widgets from SQL as described in the documentation.

P.S. According to the code, besides variables themselves, substitution also supports getting data from environment variables and from Java system properties, like this:

select '${env:PATH}';
select '${system:java.home}';

P.P.S. This answer is about using variables defined in Spark SQL itself. If you're looking to use variables defined in Python/Scala in Spark SQL, then please refer to this answer.

Upvotes: 12

alex oro

Reputation: 11

The following simple widget solution works well within Databricks Spark SQL (cluster running Spark 3.0.1 | Scala 2.12). Once you create a widget, the Databricks notebook will list it at the top and display its value, which comes in handy when you create several.

CREATE WIDGET TEXT tableName DEFAULT 'db.table'

SELECT * FROM $tableName
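If you later want to clean up, Databricks SQL also supports dropping a widget (a minimal sketch, assuming the widget created above):

REMOVE WIDGET tableName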

Upvotes: 1

wBob

Reputation: 14389

If you are using a Databricks notebook, then one easy way is to use Scala or Python to declare the variable and execute the SQL statement.

Here's a simple example in Scala:

val x = 1  // declare the variable in Scala

// interpolate it into the SQL string via the s-interpolator
val df = spark.sql(s"select * from t where col1 = $x")

df.show()

Upvotes: 2

falcon-le0

Reputation: 609

The short answer is no: Spark SQL does not currently support variables.

SQL Server uses T-SQL, which is based on the SQL standard, extended with procedural programming, local variables, and other features.

Spark SQL is pure SQL, partially compatible with the SQL standard. Since Spark 3.0, Spark SQL has introduced two experimental options to comply with the SQL standard, but no variable support was introduced there.

https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html
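For reference, the two experimental options from the linked page can be toggled like any other configuration (a sketch; these settings affect ANSI compliance, not variables):

SET spark.sql.ansi.enabled=true;            -- ANSI-compliant casting/arithmetic behavior
SET spark.sql.storeAssignmentPolicy=ANSI;   -- ANSI rules when storing values into tables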

Upvotes: -2
