Reputation: 3736
I'm working on a Spark/Cassandra application (Java) and ran into a problem when reading/mapping UDT values.
Namely,
CassandraJavaRDD<Pojo> rdd = javaFunctions(sc).cassandraTable("keyspace", "table", mapRowTo(Pojo.class));
works fine when the mapping is done from a C* table containing only primitive types, but fails with an NPE if UDTs are used for some columns:
"Requested a TypeTag of the GettableToMappedTypeConverter which can't deserialize TypeTags due to Scala 2.10 TypeTag limitation. They come back as nulls and therefore you see this NPE."
What's the best way to bypass this NPE and achieve correct deserialization?
Thanks!
PS. Strangely, writing Java POJOs to a C* table works fine (for both primitive and UDT columns) with trivial code like:
javaFunctions(rdd).writerBuilder("keyspace", "table", mapToRow(Pojo.class)).saveToCassandra();
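For reference, the failing read case can be reproduced with a bean shaped like the sketch below. The class and field names (Address, street, zip) are illustrative assumptions, not taken from the real schema; the NPE appears as soon as one field maps to a UDT column:

```java
// Hypothetical POJO pair for the failing read: plain beans with getters and
// setters, as mapRowTo expects. "Address" stands in for a class mapped to a
// Cassandra UDT; reading that column is what triggers the TypeTag NPE.
class Address {
    private String street;
    private Integer zip;

    public String getStreet() { return street; }
    public void setStreet(String street) { this.street = street; }
    public Integer getZip() { return zip; }
    public void setZip(Integer zip) { this.zip = zip; }
}

class Pojo {
    private Integer id;       // primitive column: maps fine
    private Address address;  // UDT column: read fails with the NPE above

    public Integer getId() { return id; }
    public void setId(Integer id) { this.id = id; }
    public Address getAddress() { return address; }
    public void setAddress(Address address) { this.address = address; }
}
```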
Upvotes: 2
Views: 524
Reputation: 101
This happens when you have nulls in your UDTValue in Cassandra. Spark fails to produce the correct error message because of the same TypeTag limitation.
Make sure that nullable fields in your Pojo class are declared as Optional.
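A minimal sketch of that fix, wrapping the nullable UDT field in an Optional. Which Optional type the connector actually expects (java.util.Optional vs. scala.Option) depends on the connector version; java.util.Optional is shown here as an assumption, and the field names are illustrative:

```java
import java.util.Optional;

// Hypothetical UDT-mapped bean with the nullable field declared as Optional,
// so a null in the Cassandra UDTValue maps to Optional.empty() instead of
// tripping the null handling that produces the NPE.
class Address {
    private String street;                            // NOT NULL in the UDT
    private Optional<Integer> zip = Optional.empty(); // nullable in the UDT

    public String getStreet() { return street; }
    public void setStreet(String street) { this.street = street; }
    public Optional<Integer> getZip() { return zip; }
    public void setZip(Optional<Integer> zip) { this.zip = zip; }
}
```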
Upvotes: 1