user6559130

Reputation: 59

Build Spark SQL query dynamically

How can we pass a column name and an operator dynamically to a Spark SQL query in Scala?

I tried (unsuccessfully) the following:

spark.sql("set key_tbl=mytable")
spark.sql("select count(1) from ${key_tbl}").collect()

Upvotes: 5

Views: 16953

Answers (4)

sunny_singh

Reputation: 1

I tried it, and here is the answer: you can build the query as:

s"""(SELECT * FROM param='$param')foo"""

Upvotes: 0

user6559130

Reputation: 59

I did it as follows:

val tablename = "yourtablename"
val columnname = "name"
val value = ""
val operator = "=="
spark.sql("select * from " + tablename + " where " + columnname + " " + operator + " '" + value + "'")

Upvotes: 0

stefanobaghino

Reputation: 12804

More simply, you should be able to do something like the following:

val key_tbl = "mytable"
spark.sql(s"select count(1) from ${key_tbl}").collect()

Notice the s before the query string: this uses Scala's string interpolation to build the query with another variable (key_tbl).

You can read more on String interpolation in Scala here.
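The same interpolation works for the column name and the operator from the question as well; a small sketch with placeholder values:

// All parts of the query come from variables; the values here are placeholders.
val tbl = "mytable"
val colName = "name"
val op = "="
val value = "somevalue"

spark.sql(s"select * from $tbl where $colName $op '$value'").show()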

Upvotes: 5

arjunsv3691

Reputation: 829

You can pass it as a parameter as shown below:

val param = "tableName" 
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext.sql(s"""SELECT * FROM $param""")

You can check this link for more details: https://forums.databricks.com/questions/115/how-do-i-pass-parameters-to-my-sql-statements.html

Upvotes: 8
