Reputation: 17676
I have a simple map like
val parameters: Map[String, Any] = Map("digits" -> Seq(1, 2, 3, 4, 5, 6, 7, 8, 9, 0))
and want to multiply each number by 3, as shown below:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

class PrintMap extends App {
  val conf: SparkConf = new SparkConf()
    .setAppName("sparkApiSample")
    .setMaster("local[*]")
  val session: SparkSession = SparkSession
    .builder()
    .config(conf)
    .getOrCreate()

  val parameters: Map[String, Any] = Map("digits" -> Seq(1, 2, 3, 4, 5, 6, 7, 8, 9, 0))
  val numbers: Seq[Int] = parameters("digits").asInstanceOf[Seq[Int]]
  val rdd = session.sparkContext.parallelize(numbers)
  val result = Map("result" -> rdd.map(x => x * 3).collect())

  // want to access / print the contents of the Array at "result"
  result.get("result") match {
    case Some(x) => x.asInstanceOf[Seq[Any]].foreach(println)
    case None => println("error occurred")
  }
}
Why does this result in the following exception, and how can I actually access the contents of the map?

java.lang.ClassCastException: [I cannot be cast to scala.collection.Seq
Upvotes: 1
Views: 308
Reputation: 746
Calling collect on an RDD returns an Array, and Array does not extend Seq, so your x cannot be cast to a Seq.
e.g.:
Array(2).asInstanceOf[Seq[Int]]
Throws the same exception.
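For reference, a minimal sketch of the return type involved (the SparkContext name sc and the small input Seq are just assumptions for illustration):

// collect materializes the RDD as an Array[Int], not a Seq[Int]
val tripled: Array[Int] = sc.parallelize(Seq(1, 2, 3)).map(_ * 3).collect()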
Your result is therefore of type Map[String, Array[Int]], so use x.toSeq instead of x.asInstanceOf[Seq[Int]].
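Put together, a minimal sketch of the corrected match block from your question (everything else left as-is):

result.get("result") match {
  // x here is an Array[Int]; toSeq wraps it in a Seq so it can be treated as one
  case Some(x) => x.toSeq.foreach(println)
  case None => println("error occurred")
}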
EDIT: "[I" in your stack trace means Array of Int.
It occurred to me as I was writing this: I'm guessing the reason you're using Any is that you have a bunch of different parameter and return types in your map. If that's the case, it would be handy to see a slightly more complete example.
Upvotes: 3