Rasika

Reputation: 507

Sum the Int values in an RDD[(String, Array[(String, Int)])]

I have an RDD[(String, Array[(String, Int)])]:

    ["abc",[("asd",1),("asd",3),("cvd",2),("cvd",2),("xyz",1)]]

I want to turn it into:

     ["abc",[("asd",4),("cvd",4),("xyz",1)]]

I tried:

     val y = hashedRdd.map(f => (f._1, f._2.map(_._2).reduce((a, b) => a + b)))

But this returns an RDD[(String, Int)]; I want the result as an RDD[(String, Array[(String, Int)])].

Upvotes: 1

Views: 328

Answers (2)

koiralo

Reputation: 23119

You can group the array by its first element (the key) and sum the values for each key.

// Raw RDD
val hashedRdd = spark.sparkContext.parallelize(Seq(
  ("abc", Array(("asd", 1), ("asd", 3), ("cvd", 2), ("cvd", 2), ("xyz", 1)))
))

// Group by the first element of each tuple and sum the values per key
val y = hashedRdd.map(x => {
  (x._1, x._2.groupBy(_._1).mapValues(_.map(_._2).sum))
})

Output:

y.foreach(println)
(abc,Map(xyz -> 1, asd -> 4, cvd -> 4))
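
Note that groupBy followed by mapValues leaves the value as a Map[String, Int], not an Array. If you need exactly the RDD[(String, Array[(String, Int)])] shape from the question, a minimal sketch of the extra conversion (using the same hashedRdd as above) is to append toArray:

// Same grouping as above, then convert each Map back to an Array[(String, Int)]
val z = hashedRdd.map { case (key, pairs) =>
  (key, pairs.groupBy(_._1).mapValues(_.map(_._2).sum).toArray)
}

z.collect().foreach { case (key, arr) => println((key, arr.mkString("Array(", ", ", ")"))) }
// e.g. (abc,Array((xyz,1), (asd,4), (cvd,4))) -- ordering inside the Array may vary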

Hope this helps!

Upvotes: 1

Andy Hayden

Reputation: 375915

One way would be to reduce the tuples within each group after a groupBy on the first entry:

@ hashedRdd.map { f => (f._1, f._2.groupBy{ _._1 }.map{ _._2.reduce{ (a,b)=>(a._1, a._2+b._2) } } )}.collect
res11: Array[(String, Map[String, Int])] = Array(("abc", Map("xyz" -> 1, "asd" -> 4, "cvd" -> 4)))

Upvotes: 0
