Amine CHERIFI

Reputation: 1175

Writetime of a Cassandra row in Spark

I'm using Spark with Cassandra, and I want to select the writeTime of my rows from my Cassandra table. This is my request:

   val lines = sc.cassandraTable[(String, String, String, Long)](CASSANDRA_SCHEMA, table).select("a", "b", "c", "writeTime(d)").count()

but it displays this error:

java.io.IOException: Column channal not found in table test.mytable

I've also tried this request:

   val lines = sc.cassandraTable[(String, String, String, Long)](CASSANDRA_SCHEMA, table).select("a", "b", "c", WRITETIME("d")).count()

but it displays this error:

<console>:25: error: not found: value WRITETIME

Please, how can I get the writeTime of my row? Thanks.

Upvotes: 2

Views: 2628

Answers (3)

Jacek L.

Reputation: 1416

Take a look at this ticket.

For usage, take a look at integration tests here.

Upvotes: 3

Piotr Kołaczkowski

Reputation: 2629

In spark-cassandra-connector 1.2, you can get the TTL and write time by writing:

sc.cassandraTable(...).select("column1", WriteTime("column2"), TTL("column3"))
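
Applied to the table in the question, a minimal sketch might look like this (assuming the same column names a, b, c, d and the CASSANDRA_SCHEMA / table values from the question; the WriteTime selector comes from the com.datastax.spark.connector package):

   import com.datastax.spark.connector._   // WriteTime and TTL column selectors live here

   // Column d's WRITETIME (microseconds since the epoch) maps to the Long in the tuple.
   val lines = sc.cassandraTable[(String, String, String, Long)](CASSANDRA_SCHEMA, table)
     .select("a", "b", "c", WriteTime("d"))
   lines.take(5).foreach(println)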

Upvotes: 4

RussS

Reputation: 16576

Edit: This has been fixed in the 1.2 release of the connector.

Currently the Connector doesn't support passing CQL functions through when reading from Cassandra. I've taken note of this and will open a ticket for implementing this functionality.

https://datastax-oss.atlassian.net/browse/SPARKC-55

As a workaround, you can always use the direct connector within your operations, as in:

import com.datastax.spark.connector.cql.CassandraConnector
import scala.collection.JavaConversions._   // treat the driver's java.util.List of rows as a Scala collection

val cc = CassandraConnector(sc.getConf)
val select = "SELECT WRITETIME(userId) FROM cctest.users WHERE userid = ?"
val ids = sc.parallelize(1 to 10)
val writeTimes = ids.flatMap(id =>
  cc.withSessionDo(session =>
    // Run the CQL directly and copy the rows out before the session is released.
    session.execute(select, id.toInt: java.lang.Integer).all().map(_.getLong(0))
  )
)

Code modified from "Filter from Cassandra table by RDD values".
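
For completeness, a small usage sketch of the workaround (assuming the RDD above is bound to writeTimes as shown; WRITETIME values are microseconds since the epoch):

   // Collect the write times on the driver (fine for a small id range) and print them.
   writeTimes.collect().foreach(wt => println(s"writetime: $wt microseconds since epoch"))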

Upvotes: 4
