tarun

Reputation: 216

Spark: get TTL of a column from Cassandra

I am trying to read the TTL of a column from Cassandra, but so far I couldn't make it work.

Here is what I tried so far:

SparkSession sparkSession = SparkSession.builder()
        .appName("Spark Sql Job").master("local[*]")
        .config("spark.sql.warehouse.dir", "file:///c:/tmp/spark-warehouse")
        .config("spark.cassandra.connection.host", "localhost")
        .config("spark.cassandra.connection.port", "9042")
        .getOrCreate();

SQLContext sqlCtx = sparkSession.sqlContext(); 

Dataset<Row> rowsDataset = sqlCtx.read()
        .format("org.apache.spark.sql.cassandra")
        .option("keyspace", "myschema")
        .option("table", "mytable").load();

rowsDataset.createOrReplaceTempView("xyz");
rowsDataset = sparkSession.sql("select ttl(emp_phone) from xyz");
rowsDataset.show();

Upvotes: 0

Views: 798

Answers (1)

Serge Harnyk

Reputation: 1339

From spark-cassandra-connector doc:

The select method allows querying for TTL and timestamp of the table cell.

Example Using Select to Retrieve TTL and Timestamp

// Select the column value together with its TTL and write timestamp
val row = rdd.select("column", "column".ttl, "column".writeTime).first
// The generated names follow the "ttl(col)" / "writetime(col)" pattern
val ttl = row.getLong("ttl(column)")
val timestamp = row.getLong("writetime(column)")

The selected columns can be given aliases by calling as on the column selector, which is particularly handy when fetching TTLs and timestamps.
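Since the question uses Java, here is a minimal sketch of the same approach through the connector's Java API (`CassandraJavaUtil`). It uses the keyspace, table, and column names from the question, and assumes spark-cassandra-connector 2.x and a Cassandra node reachable on `localhost:9042`; `ttl(...)` is a Cassandra function, so it has to go through the connector's `select`, not plain Spark SQL:

```java
import static com.datastax.spark.connector.japi.CassandraJavaUtil.column;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.ttl;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.writeTime;

import com.datastax.spark.connector.japi.CassandraRow;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class TtlFromCassandra {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("Ttl Job").setMaster("local[*]")
                .set("spark.cassandra.connection.host", "localhost")
                .set("spark.cassandra.connection.port", "9042");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Select the column plus its TTL and write timestamp in one pass
        CassandraRow row = javaFunctions(jsc)
                .cassandraTable("myschema", "mytable")
                .select(column("emp_phone"), ttl("emp_phone"), writeTime("emp_phone"))
                .first();

        // The generated column names follow the "ttl(col)" / "writetime(col)"
        // pattern; TTL is null for cells written without a TTL
        Long phoneTtl = row.getLong("ttl(emp_phone)");
        Long phoneWriteTime = row.getLong("writetime(emp_phone)");
        System.out.println("ttl=" + phoneTtl + ", writetime=" + phoneWriteTime);

        jsc.stop();
    }
}
```

This needs a running Cassandra instance with the table populated, so it is a sketch of the call shape rather than something runnable as-is.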

https://github.com/datastax/spark-cassandra-connector/blob/master/doc/3_selection.md

Upvotes: 3
