Reputation: 131
I'm new to Scala and I'm having difficulty writing a Spark SQL application that dynamically loads user classes and maps RDDs onto them.
rdd.map(line => {
  val cls = Class.forName("UserClass")
  val constructor = cls.getConstructor(classOf[String], classOf[String])
  Tuple1(constructor.newInstance(line._1, line._2)).asInstanceOf[cls.type]
}).toDF()
The problem is converting the object to its declared class: cls.type resolves to java.lang.Class[_], which is not what I expect. At runtime the following exception is thrown:
java.lang.UnsupportedOperationException: Schema for type java.lang.Class[_] is not supported
BTW, I'm using Scala 2.10 and Spark 1.6.1.
Any suggestions and comments would be appreciated! Thanks!
Upvotes: 1
Views: 315
Reputation: 15086
I don't really have a solution, but I can tell you some things you're doing wrong.
First, you wrap the object in a Tuple1 and then try to cast the tuple to a different type, instead of casting the object itself.
Second, cls.type is not the type that the Class instance cls represents. It is the singleton type of the variable cls, which in this case happens to be java.lang.Class[_].
Casting is mainly a compile-time operation, so you can only cast to types that are known at compile time. You say you are loading classes dynamically, so I assume they are not known to the compiler.
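One common workaround for that limitation, sketched below: if every dynamically loaded class implements a trait that *is* known at compile time, you can cast the reflectively constructed object to that trait instead of trying to cast to cls.type. The trait name (Record) and the user class here are hypothetical stand-ins for whatever your user jars provide; this illustrates the compile-time-cast point, it does not by itself make toDF infer a schema.

```scala
// Shared interface known at compile time; user classes would implement it.
trait Record extends Serializable {
  def fields: Seq[String]
}

// Hypothetical user class, standing in for one loaded from a user jar.
class UserClass(a: String, b: String) extends Record {
  def fields: Seq[String] = Seq(a, b)
}

object Demo {
  // Reflectively construct an instance and cast it to the known trait,
  // not to the singleton type of the Class variable.
  def load(name: String, line: (String, String)): Record = {
    val cls = Class.forName(name)
    val ctor = cls.getConstructor(classOf[String], classOf[String])
    ctor.newInstance(line._1, line._2).asInstanceOf[Record]
  }

  def main(args: Array[String]): Unit = {
    val r = load("UserClass", ("x", "y"))
    println(r.fields.mkString(","))
  }
}
```

With a compile-time-known trait like this you can at least work with the objects in a typed way; for the DataFrame schema problem itself, building Rows and passing an explicit StructType to createDataFrame would avoid schema inference entirely.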
Upvotes: 1