ratnamohan b

Reputation: 31

How to do a BETWEEN condition in Spark 1.6

I have tried the between condition in Spark 1.6 but got an error:

between is not a member of String.

df.filter($"date".between("2015-07-05", "2015-09-02"))

Upvotes: 2

Views: 5712

Answers (1)

elcomendante

Reputation: 1161

Either your df("date") is of string type or your date column is not being resolved as a Column. I have replicated your code and it works on a column sent_at, which is a java.sql.Timestamp:

val test = bigDF.filter($"sent_at".between("2015-07-05", "2015-09-02"))
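For the $"..." syntax to yield a Column at all, the SQLContext implicits must be in scope; a minimal sketch, assuming your SQLContext instance is named sqlContext:

// assumption: sqlContext is your existing org.apache.spark.sql.SQLContext
import sqlContext.implicits._   // brings the $"colName" (ColumnName) syntax into scope

val test = bigDF.filter($"sent_at".between("2015-07-05", "2015-09-02"))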

Ensure that your date column is a valid DataFrame column, try df("date") or col("date"), and that it is stored as a date/time type, e.g.:

case class Schema(uuid: String, sent_at: java.sql.Timestamp)

val df1 = df.as[Schema]

Run df.printSchema() to verify the type of the date column.
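Putting it together, here is a minimal, self-contained sketch for Spark 1.6 (the uuid/sent_at column names and the sample rows are made-up assumptions); note that between is inclusive on both bounds:

import java.sql.Timestamp
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("between-demo").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._                      // enables $"..." and toDF

val df = Seq(
  ("a", Timestamp.valueOf("2015-07-10 00:00:00")),
  ("b", Timestamp.valueOf("2015-10-01 00:00:00"))
).toDF("uuid", "sent_at")

df.printSchema()                                   // sent_at should show as timestamp, not string

val filtered = df.filter($"sent_at".between("2015-07-05", "2015-09-02"))
filtered.show()                                    // only the first row falls inside the range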

Upvotes: 4
