Sri

Reputation: 643

Spark reading the Cassandra UDT column

I have a column named 'ABC_lines' in my Cassandra table.

It is a UDT data type; the UDT name is "ABC_om_line".

If I look at the schema of the table, this is how the column type appears:

ABC_lines list<frozen<ABC_om_line>>

This UDT "ABC_om_line" has 30 fields: col1, col2, col3 and so on up to col30.

Now I want to create a DataFrame by pulling just col2 and col3 from this UDT.

Can anyone please help?

I am using Spark 1.6 with Scala 2.10.

Upvotes: 1

Views: 998

Answers (1)

doanduyhai

Reputation: 8812

Unfortunately the connector cannot map collections of UDTs; see here: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/6_advanced_mapper.md#using-custom-field-types

Custom converters for collections are not supported
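Since custom converters for collections are not supported, one possible workaround is to read the table with the low-level RDD API and unpack the UDT fields by hand, then convert the result to a DataFrame. Below is a minimal sketch only: the keyspace/table names ("my_keyspace", "my_table"), the connection host, the lowercased column name "abc_lines", and the assumption that col2 and col3 are text fields are placeholders for your actual schema.

    import com.datastax.spark.connector._
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Hypothetical names: replace "my_keyspace", "my_table", "abc_lines",
    // and the field types with the ones from your real schema.
    val conf = new SparkConf()
      .setAppName("ExtractUdtFields")
      .set("spark.cassandra.connection.host", "127.0.0.1") // assumed local node

    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Read the raw rows, then unpack the list<frozen<ABC_om_line>> column by hand.
    // Each element comes back as a UDTValue, whose fields can be read by name.
    val pairs = sc.cassandraTable("my_keyspace", "my_table")
      .flatMap { row =>
        row.getList[UDTValue]("abc_lines").map { udt =>
          (udt.getString("col2"), udt.getString("col3")) // assuming text fields
        }
      }

    // Build a DataFrame holding just col2 and col3.
    val df = pairs.toDF("col2", "col3")
    df.show()

This bypasses the DataFrame/mapper path entirely, so the unsupported collection-of-UDT conversion never comes into play.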

Upvotes: 2
