Naren

Reputation: 457

How to map an RDD to another RDD with Scala in Spark?

I have an RDD:

RDD1 = (big,data), (apache,spark), (scala,language) ... 

and I need to combine it with a timestamp

RDD2 = ('2015-01-01 13.00.00')

so that I get

RDD3 = (big, data, 2015-01-01 13.00.00), (apache, spark, 2015-01-01 13.00.00), (scala, language, 2015-01-01 13.00.00)

I wrote a simple map function for this:

RDD3 = RDD1.map(rdd => (rdd, RDD2))

but it does not work, and I don't think this is the right approach. How should I do it? I am new to Scala and Spark. Thank you.

Upvotes: 0

Views: 1087

Answers (1)

Peter Neyens

Reputation: 9820

You can use zip:

val rdd1 = sc.parallelize(("big","data") :: ("apache","spark") :: ("scala","language") :: Nil)
// RDD[(String, String)]
val rdd2 = sc.parallelize(List.fill(3)(new java.util.Date().toString))
// RDD[String]

rdd1.zip(rdd2).map{ case ((a,b),c) => (a,b,c) }.collect()
// Array((big,data,Fri Jul 24 22:25:01 CEST 2015), (apache,spark,Fri Jul 24 22:25:01 CEST 2015), (scala,language,Fri Jul 24 22:25:01 CEST 2015))
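Note that zip requires both RDDs to have the same number of partitions and the same number of elements in each partition, otherwise it fails at runtime. If that isn't guaranteed, a more forgiving sketch (my addition, not part of the original answer) is to key both RDDs by element index and join:

val byIndex1 = rdd1.zipWithIndex.map(_.swap) // RDD[(Long, (String, String))]
val byIndex2 = rdd2.zipWithIndex.map(_.swap) // RDD[(Long, String)]
byIndex1.join(byIndex2).values.map { case ((a, b), c) => (a, b, c) }.collect()
// same result as above, without zip's partitioning requirement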

If you want the same timestamp for every element of rdd1:

val now = new java.util.Date().toString
rdd1.map{ case (a,b) => (a,b,now) }.collect()
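If the timestamp should match the question's 2015-01-01 13.00.00 format, a SimpleDateFormat can produce the string once on the driver before mapping (a sketch; the pattern string is inferred from the question's example):

val fmt = new java.text.SimpleDateFormat("yyyy-MM-dd HH.mm.ss")
val now = fmt.format(new java.util.Date()) // e.g. "2015-07-24 22.25.01"
rdd1.map { case (a, b) => (a, b, now) }.collect()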

Upvotes: 5
