Ryan K

Reputation: 351

How to customize column mappings with Spark Cassandra Connector in Java?

I want to change a column mapping to use append behavior. Is there a better way to customize column mappings with the Spark Cassandra Connector in Java than the following?

ColumnName song_id = new ColumnName("song_id", Option.empty());
CollectionColumnName key_codes = new ColumnName("key_codes", Option.empty()).append();

List<ColumnRef> collectionColumnNames = Arrays.asList(song_id, key_codes);
scala.collection.Seq<ColumnRef> columnRefSeq = JavaApiHelper.toScalaSeq(collectionColumnNames);

javaFunctions(songStream)
        .writerBuilder("demo", "song", mapToRow(PianoSong.class))
        .withColumnSelector(new SomeColumns(columnRefSeq))
        .saveToCassandra();

This is taken from a Spark Streaming code sample.

Upvotes: 0

Views: 534

Answers (1)

RussS

Reputation: 16576

Just make your column refs using CollectionColumnName, which has the constructor:

case class CollectionColumnName(
    columnName: String,
    alias: Option[String] = None,
    collectionBehavior: CollectionBehavior = CollectionOverwrite) extends ColumnRef 

You can rename a column by setting alias, and you can change the insert behavior with collectionBehavior, which takes the following classes (from the connector's API docs); a short Java note follows the listing.

/** Insert behaviors for Collections. */
sealed trait CollectionBehavior
case object CollectionOverwrite extends CollectionBehavior
case object CollectionAppend extends CollectionBehavior
case object CollectionPrepend extends CollectionBehavior
case object CollectionRemove extends CollectionBehavior
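
From Java you generally have to pass all three constructor arguments explicitly, since Scala's default argument values aren't directly usable from Java. As a small sketch (the alias value "keyCodes" and the import paths are my assumptions, so check them against your connector version), a renamed column that keeps the default overwrite behavior would look like:

import scala.Option;
import com.datastax.spark.connector.CollectionColumnName;
import com.datastax.spark.connector.CollectionOverwrite$;

// Refer to the Cassandra column "key_codes" under the alias "keyCodes",
// keeping overwrite semantics; the behavior is passed explicitly because
// the Scala default value isn't visible from Java.
CollectionColumnName renamedKeys = new CollectionColumnName(
    "key_codes",
    Option.apply("keyCodes"),
    CollectionOverwrite$.MODULE$);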

Which means that, to get the append behavior you want, you can just do:

CollectionColumnName appendColumn = 
  new CollectionColumnName("ColumnName", Option.empty(), CollectionAppend$.MODULE$);

Which looks a bit more Java-y and is a bit more explicit. Did you have any other goals for this code?
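
For completeness, here is a sketch of that column ref plugged back into the writer from your snippet, wrapped in a hypothetical helper method. The import paths are how I'd expect them to look for the connector's Java API, so double-check them against the version you're using:

import java.util.Arrays;
import java.util.List;

import scala.Option;
import scala.collection.Seq;

import org.apache.spark.streaming.api.java.JavaDStream;

import com.datastax.spark.connector.CollectionAppend$;
import com.datastax.spark.connector.CollectionColumnName;
import com.datastax.spark.connector.ColumnName;
import com.datastax.spark.connector.ColumnRef;
import com.datastax.spark.connector.SomeColumns;
import com.datastax.spark.connector.util.JavaApiHelper;

import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;
import static com.datastax.spark.connector.japi.CassandraStreamingJavaUtil.javaFunctions;

public static void saveSongs(JavaDStream<PianoSong> songStream) {
    // Plain key column, no alias.
    ColumnName songId = new ColumnName("song_id", Option.empty());

    // Collection column written with append semantics instead of the default overwrite.
    CollectionColumnName keyCodes =
        new CollectionColumnName("key_codes", Option.empty(), CollectionAppend$.MODULE$);

    List<ColumnRef> columns = Arrays.<ColumnRef>asList(songId, keyCodes);
    Seq<ColumnRef> columnRefSeq = JavaApiHelper.toScalaSeq(columns);

    javaFunctions(songStream)
        .writerBuilder("demo", "song", mapToRow(PianoSong.class))
        .withColumnSelector(new SomeColumns(columnRefSeq))
        .saveToCassandra();
}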

Upvotes: 1
