Reputation: 1175
I'm using Spark with Cassandra, and I want to select the writeTime of my rows from my Cassandra table. This is my request:
val lines = sc.cassandraTable[(String, String, String, Long)](CASSANDRA_SCHEMA, table).select("a", "b", "c", "writeTime(d)").count()
but it displays this error:
java.io.IOException: Column channal not found in table test.mytable
I've also tried this request:
val lines = sc.cassandraTable[(String, String, String, Long)](CASSANDRA_SCHEMA, table).select("a", "b", "c", WRITETIME("d")).count()
but it displays this error:
<console>:25: error: not found: value WRITETIME
How can I get the writeTime of my row? Thanks.
Upvotes: 2
Views: 2628
Reputation: 1416
Take a look at this ticket.
For usage, take a look at integration tests here.
Upvotes: 3
Reputation: 2629
In spark-cassandra-connector 1.2, you can get the TTL and write time of a column by writing:
sc.cassandraTable(...).select("column1", WriteTime("column2"), TTL("column3"))
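As a fuller sketch (assuming a hypothetical keyspace `test` with a table `users` holding columns `userid` and `name`; `WriteTime` and `TTL` are the column selectors from the `com.datastax.spark.connector` package):

```scala
import com.datastax.spark.connector._

// Read each row's userid together with the write time and TTL of its
// "name" column. WriteTime(...) and TTL(...) are pushed down to CQL as
// WRITETIME(name) and TTL(name).
val rows = sc.cassandraTable("test", "users")
  .select("userid", WriteTime("name"), TTL("name"))

rows.collect().foreach(println)
```

Each result row then carries three values: the column itself, the write timestamp (microseconds since the epoch), and the remaining TTL in seconds.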
Upvotes: 4
Reputation: 16576
Currently the connector doesn't support passing CQL functions through when reading from Cassandra. I've taken note of this and have opened a ticket for implementing the functionality:
https://datastax-oss.atlassian.net/browse/SPARKC-55
As a workaround, you can always use the direct connector within your operations, like in:
import scala.collection.JavaConverters._
import com.datastax.spark.connector.cql.CassandraConnector

val cc = CassandraConnector(sc.getConf)
val select = "SELECT WRITETIME(userId) FROM cctest.users WHERE userid = ?"
val ids = sc.parallelize(1 to 10)
val writeTimes = ids.flatMap(id =>
  cc.withSessionDo(session =>
    // Run the query per id and pull the write time out of each row
    session.execute(select, id: java.lang.Integer).all.asScala.map(_.getLong(0))
  )
)
Code modified from Filter from Cassandra table by RDD values
Upvotes: 4