user1582674

Reputation:

How to sum a string column in RDD format?

I'm new to Spark. I've loaded a CSV file with sc.textFile. I want to use reduceByKey to sum a column that is of string type but contains numbers. When I try something like reduceByKey(_ + _), it just puts the numbers next to each other (string concatenation). How can I do this? Should I convert the column first?
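Roughly what I'm seeing, as a simplified sketch with made-up data (not my real CSV):

val rdd = sc.parallelize(Seq(("a", "1"), ("a", "2"), ("b", "3")))

// The values are strings, so + concatenates instead of adding:
rdd.reduceByKey(_ + _).collect()  // gives (a,"12") instead of (a,3.0)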

Upvotes: 2

Views: 755

Answers (1)

evan.oman

Reputation: 5572

You will need to parse the strings, for example:

scala> val rdd = sc.parallelize(Seq(("a", "1"), ("a", "2.7128"), ("b", "3.14"),
       ("b", "4"), ("b", "POTATO")))
rdd: org.apache.spark.rdd.RDD[(String, String)] = ParallelCollectionRDD[57] at parallelize at <console>:27

scala> def parseDouble(s: String) = try { Some(s.toDouble) } catch { case _: NumberFormatException => None }
parseDouble: (s: String)Option[Double]

scala> val reduced = rdd.flatMapValues(parseDouble).reduceByKey(_+_)
reduced: org.apache.spark.rdd.RDD[(String, Double)] = ShuffledRDD[59] at reduceByKey at <console>:31

scala> reduced.collect.foreach{println}
(a,3.7128)
(b,7.140000000000001)
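Note that flatMapValues with an Option drops the values that fail to parse (like "POTATO" above), so only the rows that actually contain numbers are summed. If you prefer, the parser can also be written with scala.util.Try; a sketch of the same idea:

import scala.util.Try

// Equivalent parser: Try captures the NumberFormatException and
// toOption turns success/failure into Some/None.
def parseDouble(s: String): Option[Double] = Try(s.toDouble).toOption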

Upvotes: 3
