ozzieisaacs

Reputation: 843

Spark Cassandra Connector timestamp

// cutoff roughly 30 days (one mean month of 2629746 seconds) ago, in milliseconds;
// the Long literal avoids Int overflow in 2629746 * 1000
val timestamp: Long = System.currentTimeMillis - (2629746L * 1000)
sc.cassandraTable("keyspace", "users").select("id").where("timestamp > ?", timestamp).cassandraCount()

"timestamp" here is a standard cassandra type timestamp and formatted as such.

I want to convert my timestamp to the correct format so I can find any user record whose timestamp was updated in the last 30 days, but I am not sure how to format it correctly in Scala. I see there is a TimestampFormatter class in the DataStax Cassandra connector, but I can't make it work for me.
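
For reference, this is roughly the shape of what I'm after, with the cutoff passed as a java.util.Date instead of a raw Long (I'm not certain this is the right way to bind it against a timestamp column):

import java.util.Date

// cutoff 30 days back, passed as a java.util.Date rather than a raw Long
val thirtyDaysMs: Long = 30L * 24 * 60 * 60 * 1000
val cutoff = new Date(System.currentTimeMillis - thirtyDaysMs)

sc.cassandraTable("keyspace", "users")
  .select("id")
  .where("timestamp > ?", cutoff)
  .cassandraCount()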

Upvotes: 1

Views: 1724

Answers (1)

flavian

Reputation: 28511

You would not really use that approach here; instead, assign a timeuuid to each update, so the table has a clustering column timestamp of type timeuuid with CLUSTERING ORDER BY (timestamp DESC), roughly as sketched below.
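
For concreteness, a minimal sketch of the table layout this assumes (the keyspace, table, and column names just mirror the question's placeholders, and running the DDL through CassandraConnector is only one way to do it):

import com.datastax.spark.connector.cql.CassandraConnector

// One row per update, clustered by a timeuuid column, newest first.
// "keyspace" is quoted only because it is the question's placeholder name.
CassandraConnector(sc.getConf).withSessionDo { session =>
  session.execute(
    """CREATE TABLE IF NOT EXISTS "keyspace".users (
      |  id uuid,
      |  timestamp timeuuid,
      |  PRIMARY KEY (id, timestamp)
      |) WITH CLUSTERING ORDER BY (timestamp DESC)""".stripMargin)
}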

import com.datastax.driver.core.utils.UUIDs

// UUIDs.startOf takes a millisecond timestamp and returns the lowest possible
// timeuuid for that instant, so the Long cutoff from the question can be reused here.
sc.cassandraTable("keyspace", "users")
  .select("id")
  .where("timestamp > ?", UUIDs.startOf(timestamp))
  .cassandraCount()

Or you could just use the CQL functions minTimeuuid and maxTimeuuid.

import org.joda.time.{DateTime, DateTimeZone}

val now = DateTime.now(DateTimeZone.UTC)
val start = now.minusDays(30)

// Render both instants as CQL timestamp literals and put both bounds into a
// single where clause for minTimeuuid()/maxTimeuuid().
val pattern = "yyyy-MM-dd HH:mm:ssZ"

sc.cassandraTable("keyspace", "users")
  .select("id")
  .where(
    s"timestamp >= minTimeuuid('${start.toString(pattern)}') " +
    s"AND timestamp <= maxTimeuuid('${now.toString(pattern)}')")
  .cassandraCount()

Upvotes: 0
