tarun

Reputation: 216

Apache Spark SQL read and write Cassandra TTL

How does Spark SQL read the Apache Cassandra TTL of data with a DataFrame (Spark SQL)? I can't find any example. Can I get one?

My previous question was about the RDD API: Spark get ttl column from cassandra

But now the question is about DataFrames.

Upvotes: 0

Views: 592

Answers (1)

Alex Ott

Reputation: 87099

The Spark Cassandra Connector doesn't support TTL and WriteTime in the DataFrame API. You can track JIRA SPARKC-528 for progress.

You can read TTL and/or WriteTime using the RDD API (best with a mapper to a case class), and then convert the result into a DataFrame, as sketched below.
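A minimal sketch of that approach, assuming a running spark-shell with the connector on the classpath and a hypothetical table ks.tbl(id int PRIMARY KEY, value text):

```scala
import com.datastax.spark.connector._   // provides cassandraTable and the .ttl column selector
import spark.implicits._                // needed for .toDF on the typed RDD

// Case class the rows are mapped to; value_ttl from the select below maps to valueTtl
case class Record(id: Int, value: String, valueTtl: Option[Int])

val rdd = spark.sparkContext
  .cassandraTable[Record]("ks", "tbl")
  .select("id", "value", "value".ttl as "value_ttl")  // read the TTL of the "value" column

// Convert the RDD of case classes into a DataFrame
val df = rdd.toDF()
df.show()
```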

Update, September 2020: Support for TTL & WriteTime in DataFrames was released as part of the SCC 2.5.0 release.
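With SCC 2.5+ this can then be done directly on a DataFrame. A rough sketch, assuming the same hypothetical ks.tbl table and the ttl/writeTime helpers exposed by the org.apache.spark.sql.cassandra package:

```scala
import org.apache.spark.sql.cassandra._   // cassandraFormat plus ttl/writeTime helpers (SCC 2.5+)

val df = spark.read
  .cassandraFormat("tbl", "ks")   // (table, keyspace)
  .load()

// Select the TTL and WriteTime of the "value" column alongside the data
df.select(
    df("id"),
    df("value"),
    ttl("value").as("value_ttl"),
    writeTime("value").as("value_writetime"))
  .show()
```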

Upvotes: 0
