michal.dul

Reputation: 175

Unable to compare dates in Spark SQL query

Using PySpark with the JDBC driver for MySQL, I am unable to filter on columns of type date: a java.lang.ClassCastException is thrown.

from pyspark.sql import SQLContext

sqlContext = SQLContext(sc)
# url is the JDBC connection string for the MySQL database
df = sqlContext.load(source="jdbc", url=url, dbtable="reports")
sqlContext.registerDataFrameAsTable(df, "reports")
df.printSchema()
# root
#  |-- id: integer (nullable = false)
#  |-- day: date (nullable = false)

# Comparing the date column against a string literal fails:
query = sqlContext.sql("select * from reports where day > '2015-05-01'")
query.collect()  # ... most recent failure: ... java.lang.ClassCastException

Changing the day column's type to timestamp solves the problem, but I have to keep the original schema.

Upvotes: 3

Views: 21090

Answers (1)

Spiro Michaylov

Reputation: 3571

Judging by the relevant unit tests in the Spark source, you need an explicit cast:

select * from reports where day > cast('2015-05-01' as date)

There's no sign of it in the Spark SQL documentation, but it seems to have been available in Transact-SQL and Hive for some time.
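For completeness, a minimal sketch of the corrected query in PySpark, reusing the reports table registered in the question:

# Same setup as in the question; the explicit cast lets Spark compare
# the literal against the date column instead of throwing.
query = sqlContext.sql(
    "select * from reports where day > cast('2015-05-01' as date)"
)
query.collect()  # returns rows whose day falls after 2015-05-01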

Upvotes: 13
