Climbs_lika_Spyder

Reputation: 6714

How to use mutable map in Scala on Apache Spark? Key not found error

I am using Spark 1.3.0.
My map appears to contain the key, but I get a key-not-found error (or None) when I try to access it.

import scala.collection.mutable.HashMap
val labeldata = sc.textFile("/home/data/trainLabels2.csv")
val labels: Array[Array[String]] = labeldata.map(line => line.split(",")).collect()
var fn2label: HashMap[String,Int] = new HashMap()
labels.foreach{ x => fn2label += (x(0) -> x(1).toInt)}

My map then looks like:

scala> fn2label
res45: scala.collection.mutable.HashMap[String,Int] = Map("k2VDmKNaUlXtnMhsuCic" -> 1, "AGzOvc4dUfw1B8nDmY2X" -> 1, "BqRPMt4QY1sHzvF6JK7j" -> 3,.....

It even has keys:

scala> fn2label.keys
res46: Iterable[String] = Set("k2VDmKNaUlXtnMhsuCic", "AGzOvc4dUfw1B8nDmY2X", "BqRPMt4QY1sHzvF6JK7j",

But I cannot access them:

scala> fn2label.get("k2VDmKNaUlXtnMhsuCic")
res48: Option[Int] = None

scala> fn2label("k2VDmKNaUlXtnMhsuCic")
java.util.NoSuchElementException: key not found: k2VDmKNaUlXtnMhsuCic

What I have tried: broadcasting the map, broadcasting both labels and the map, using Map instead of HashMap, and parallelizing the map as suggested in https://stackoverflow.com/a/24734410/1290485:

val mapRdd = sc.parallelize(fn2label.toSeq)
mapRdd.lookup("k2VDmKNaUlXtnMhsuCic")
res50: Seq[Int] = WrappedArray()

What am I missing??

Upvotes: 2

Views: 2613

Answers (1)

Nikita

Reputation: 4515

You just have extra quotes in your data: the CSV fields contain literal double-quote characters, so the actual keys in your map are the quoted strings, not the bare strings you are looking up:

scala> val fn2label = scala.collection.mutable.HashMap("\"k2VDmKNaUlXtnMhsuCic\"" -> 1, "\"AGzOvc4dUfw1B8nDmY2X\"" -> 1, "\"BqRPMt4QY1sHzvF6JK7j\"" -> 3)
fn2label: scala.collection.mutable.HashMap[String,Int] = Map("BqRPMt4QY1sHzvF6JK7j" -> 3, "AGzOvc4dUfw1B8nDmY2X" -> 1, "k2VDmKNaUlXtnMhsuCic" -> 1)

scala> fn2label.get("\"k2VDmKNaUlXtnMhsuCic\"")
res4: Option[Int] = Some(1)

scala>  fn2label.keys
res5: Iterable[String] = Set("BqRPMt4QY1sHzvF6JK7j", "AGzOvc4dUfw1B8nDmY2X", "k2VDmKNaUlXtnMhsuCic")
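Since Spark 1.3 has no built-in CSV parser, one fix is to strip the quotes yourself when building the map. A minimal sketch (the sample rows below are made up to mimic your data; in your code, `x(0)` and `x(1)` come from `line.split(",")`):

```scala
import scala.collection.mutable.HashMap

// Hypothetical rows standing in for labeldata.map(_.split(",")).collect():
// each field still carries the literal quote characters from the file.
val labels: Array[Array[String]] = Array(
  Array("\"k2VDmKNaUlXtnMhsuCic\"", "1"),
  Array("\"AGzOvc4dUfw1B8nDmY2X\"", "1"),
  Array("\"BqRPMt4QY1sHzvF6JK7j\"", "3")
)

val fn2label: HashMap[String, Int] = new HashMap()
labels.foreach { x =>
  // Remove the surrounding quotes before using the fields
  val key   = x(0).stripPrefix("\"").stripSuffix("\"")
  val value = x(1).stripPrefix("\"").stripSuffix("\"").toInt
  fn2label += (key -> value)
}

println(fn2label.get("k2VDmKNaUlXtnMhsuCic"))  // Some(1)
```

Alternatively, keep looking up the quoted form (`fn2label.get("\"k2VDmKNaUlXtnMhsuCic\"")`), but stripping at load time is cleaner.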

Upvotes: 8
