Mnemosyne

Reputation: 1192

Cassandra-Spark Connector: uploading by passing arguments

I am using the spark-cassandra-connector in Scala and I want to upload some entries into a table. I have seen the following insert method, which uses the session construct from the Python driver:

import uuid

session.execute(
    """
    INSERT INTO users (name, credits, user_id)
    VALUES (%s, %s, %s)
    """,
    ("John O'Reilly", 42, uuid.uuid1())
)

Does the spark-cassandra-connector support a similar way of passing the arguments for an insert, and if so, what would the construct look like? When I tried the approach above it did not work.

Upvotes: 0

Views: 30

Answers (1)

RussS

Reputation: 16576

The Spark Cassandra Connector is primarily made for manipulating Cassandra data with Spark. That means if you aren't working with a Dataset, DataFrame, or RDD, you probably don't need the Spark Cassandra Connector at all.
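
If your entries are already in an RDD (or a DataFrame), the connector can write them for you directly. A minimal sketch, assuming a test2.users (name text, credits int, user_id uuid PRIMARY KEY) table already exists and a Cassandra node is reachable on localhost:

import java.util.UUID
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._   // brings saveToCassandra into scope

// conf carries the Cassandra connection settings for the connector
val conf = new SparkConf()
  .setAppName("users-upload")
  .set("spark.cassandra.connection.host", "127.0.0.1")
val sc = new SparkContext(conf)

// test2.users (name text, credits int, user_id uuid PRIMARY KEY) is assumed to exist
val rows = sc.parallelize(Seq(("John O'Reilly", 42, UUID.randomUUID())))
rows.saveToCassandra("test2", "users", SomeColumns("name", "credits", "user_id"))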

That style of parameterized statement is supported by the Java Driver, which is included as part of the Spark Cassandra Connector and can be accessed via the CassandraConnector wrapper, as explained in the documentation:

import com.datastax.spark.connector.cql.CassandraConnector

// conf is the application's SparkConf, which carries the Cassandra connection settings
CassandraConnector(conf).withSessionDo { session =>
  session.execute("CREATE KEYSPACE test2 WITH REPLICATION = {'class': 'SimpleStrategy', 'replication_factor': 1 }")
  session.execute("CREATE TABLE test2.words (word text PRIMARY KEY, count int)")
}
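
Using that same wrapper, a parameterized insert like the Python example can be executed against the session directly. A sketch, assuming the test2.users table from above exists; note that the Java Driver uses ? as the bind marker rather than the Python driver's %s:

import java.util.UUID
import com.datastax.spark.connector.cql.CassandraConnector

CassandraConnector(conf).withSessionDo { session =>
  // test2.users is assumed to exist; ? marks the positional bind parameters
  session.execute(
    "INSERT INTO test2.users (name, credits, user_id) VALUES (?, ?, ?)",
    "John O'Reilly", Int.box(42), UUID.randomUUID())
}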

Upvotes: 1
