Reputation: 595
I am saving my predicted results to a Cassandra database with the Spark Cassandra Connector, using the code below:
CassandraJavaUtil.javaFunctions(sensorDataRDD)
        .writerBuilder(modelParamter.keyspace, "sensor_data_2",
                CassandraJavaUtil.mapToRow(SensorData2Double.class))
        .saveToCassandra();
The data is time-stamped at second granularity, while the predicted results are written at hour granularity. I therefore need to delete all previous records first. The delete should target a specific column in the Cassandra table, identified by its unique key.
I am not sure how to delete all previous records in a way that guarantees the new records inserted by the Java code above are not removed afterwards by my Cassandra delete query.
Is there any atomicity guarantee on Cassandra columns when I delete or insert rows (by primary key)?
Upvotes: 1
Views: 357
Reputation: 2466
Why do you need to delete previous data?
If you are writing new data for the same keys, it will simply be overwritten.
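For illustration, here is a minimal sketch of that upsert behaviour using the DataStax Java driver. The contact point and the (sensor_id, ts, value) schema are assumptions for the sketch, not the asker's actual table:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public class UpsertDemo {
    public static void main(String[] args) {
        // Hypothetical contact point and keyspace; adjust to your cluster.
        try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
             Session session = cluster.connect("sensor_keyspace")) {

            // In Cassandra, INSERT is an upsert: writing the same primary key
            // twice keeps only the latest value, so no prior DELETE is needed.
            session.execute("INSERT INTO sensor_data_2 (sensor_id, ts, value) "
                    + "VALUES ('s1', '2017-01-01 00:00:00', 1.0)");
            session.execute("INSERT INTO sensor_data_2 (sensor_id, ts, value) "
                    + "VALUES ('s1', '2017-01-01 00:00:00', 2.0)");

            // Reading the row back returns 2.0; the second write overwrote the first.
            System.out.println(session.execute(
                    "SELECT value FROM sensor_data_2 "
                    + "WHERE sensor_id = 's1' AND ts = '2017-01-01 00:00:00'")
                    .one().getDouble("value"));
        }
    }
}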
If you don't want to delete the "new" data, you can check the write time of each value (with the WRITETIME function) and, if it is fresh enough, skip the delete.
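A rough sketch of that check, again with the DataStax Java driver and the same hypothetical schema. WRITETIME returns microseconds since the epoch, and the full-table scan is only for illustration, not something to run on a large table:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

public class CleanupOldRows {
    public static void main(String[] args) {
        // Cutoff: anything written more than an hour ago is considered stale.
        // WRITETIME values are microseconds since the epoch.
        long oneHourMicros = 60L * 60 * 1000 * 1000;
        long cutoff = System.currentTimeMillis() * 1000 - oneHourMicros;

        try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
             Session session = cluster.connect("sensor_keyspace")) {

            // WRITETIME(value) reports when the 'value' column was last written.
            for (Row row : session.execute(
                    "SELECT sensor_id, ts, WRITETIME(value) AS wt FROM sensor_data_2")) {
                // Delete only rows written before the cutoff, so freshly
                // inserted predictions survive the cleanup.
                if (row.getLong("wt") < cutoff) {
                    session.execute(
                            "DELETE FROM sensor_data_2 WHERE sensor_id = ? AND ts = ?",
                            row.getString("sensor_id"), row.getTimestamp("ts"));
                }
            }
        }
    }
}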
Upvotes: 0