Dev

Reputation: 13773

How to use dynamic values in Interval in Spark SQL query

A working Spark SQL query:

SELECT current_timestamp() - INTERVAL 10 DAYS as diff from sample_table

The Spark SQL query I tried (non-working):

SELECT current_timestamp() - INTERVAL col1 DAYS as diff from sample_table

The error raised by the above query:

Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 767, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 73, in deco
    raise ParseException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.ParseException: "
mismatched input 'DAYS' expecting

== SQL ==
SELECT current_timestamp() - INTERVAL col1 DAYS as diff from sample_table
------------------------------------------^^^

I want to use col1 as a dynamic interval value. How can I achieve this?

Upvotes: 7

Views: 4119

Answers (1)

hytonenl

Reputation: 51

The interval literal syntax only accepts constants, so a column value cannot appear after INTERVAL. The Spark SQL function make_interval (available in Spark 3.0+) achieves this; its arguments are make_interval(years, months, weeks, days, hours, mins, secs), so col1 goes in the days slot:

SELECT current_timestamp() - make_interval(0, 0, 0, col1, 0, 0, 0) as diff from sample_table

Upvotes: 5
