Reputation: 4650
In a Scala/Spark DataFrame,
dfReduced.schema.fieldNames
is a Java String array (String[]). However,
dfReduced.schema.fieldNames.asInstanceOf[Seq[String]]
throws
java.lang.ClassCastException: [Ljava.lang.String; cannot be cast to scala.collection.Seq
Assigning the same array to a Seq[String] is fine.
val f3:Seq[String]=dfReduced.schema.fieldNames
As a Java programmer this surprises me, since both would require a cast in Java. Can someone explain why there is this distinction in Scala?
(Note, I'm not being critical, I just want to understand Scala better)
Upvotes: 0
Views: 295
Reputation: 2294
The reason why
val f3: Seq[String] = dfReduced.schema.fieldNames
works is that in Scala there is an implicit conversion in scope that can convert an Array[T] to a Seq[T] implicitly. In Java there is no such implicit conversion available.
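As a rough sketch of what that looks like (plain Scala, no Spark needed; the object name is mine, and the exact implicit the compiler picks, wrapRefArray here, depends on the Scala version):

object ImplicitConversionSketch extends App {
  val fieldNames: Array[String] = Array("id", "name", "age")

  // What you write:
  val f3: Seq[String] = fieldNames

  // Roughly what the compiler inserts for you:
  val f3Explicit: Seq[String] = Predef.wrapRefArray(fieldNames)

  println(f3)          // WrappedArray(id, name, age) or ArraySeq(id, name, age), depending on Scala version
  println(f3Explicit)
}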
As Leo C mentioned in a comment, the difference is a run-time type cast versus a compile-time type ascription. For more info you can refer to this link.
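A minimal, Spark-free sketch of that distinction (the object name is mine):

import scala.util.Try

object CastVsAscription extends App {
  val fieldNames: Array[String] = Array("id", "name", "age")

  // Compile-time type ascription: the compiler applies the implicit
  // Array-to-Seq conversion, so no cast happens at run time.
  val ascribed: Seq[String] = fieldNames
  println(ascribed)

  // Run-time cast: asInstanceOf only checks the JVM class of the value;
  // a Java String[] is not a scala.collection.Seq, so it fails.
  println(Try(fieldNames.asInstanceOf[Seq[String]]))
  // Failure(java.lang.ClassCastException: [Ljava.lang.String; cannot be cast to scala.collection.Seq)
}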
Hope this clears your doubt.
Thanks
Upvotes: 3