sandeep varma

Reputation: 1

Nested flatMap in spark

In the code snippet below I have declared an RDD by parallelizing List(1,2,3,4). What I wanted to do was append List(1,2,3,4) to each element of the above RDD. I did so by using a nested flatMap, since it can return multiple values for each element of an RDD. The code is as follows:

val rand6=sc.parallelize(List(1,2,3,4))
val bv=sc.broadcast(List(5,6,7,8))
rand6.flatMap(s=>{
  val c=List(1,2,3,4)
  val a=List(s,c)
  val b=a.flatMap(r=>r)
  b
})

But I am getting the following error:

command-1095314872161512:74: error: type mismatch;
 found   : Any
 required: scala.collection.GenTraversableOnce[?]
  val b=a.flatMap(r=>r)
                     ^

Is the problem with the syntax, or are we not supposed to use flatMap in this fashion?

It would be very helpful if someone could help me understand this.

Upvotes: 0

Views: 363

Answers (1)

QuickSilver

Reputation: 4045

Try to add types wherever possible in your Scala code. The error comes from `val a = List(s, c)`: since `s` is an `Int` and `c` is a `List[Int]`, the compiler infers `a` as `List[Any]`, and `flatMap` cannot flatten `Any` because it is not a `GenTraversableOnce`. Prepending with `s :: c` keeps the element type as `Int`, so no second flatten is needed. Based on your question description, I came up with the solution below:

import org.apache.spark.broadcast.Broadcast
import org.apache.spark.rdd.RDD

object RandomDF {

  def main(args: Array[String]): Unit = {

    val spark = Constant.getSparkSess // project-specific helper that builds the SparkSession
    val sc = spark.sparkContext
    val rand6: RDD[Int] = sc.parallelize(List(1, 2, 3, 4))
    val bv: Broadcast[List[Int]] = sc.broadcast(List(5, 6, 7, 8))
    val output = rand6.map((s: Int) => {
      val c: List[Int] = List(1, 2, 3, 4)
      s :: c // prepend keeps the type List[Int], so no extra flatten is needed
    }).collect().toList

    println(output)
  }

}
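To see the type issue in isolation, here is a minimal sketch using plain Scala collections (no Spark needed; `rand6` and `extra` are stand-ins for the RDD and the broadcast list). Because `s :: extra` is a `List[Int]`, `flatMap` knows how to flatten it, which is exactly what fails when the intermediate value is `List[Any]`:

```scala
object FlatMapFix {
  def main(args: Array[String]): Unit = {
    val rand6 = List(1, 2, 3, 4) // stand-in for the RDD elements
    val extra = List(5, 6, 7, 8) // stand-in for the broadcast list

    // s :: extra has type List[Int], so flatMap can flatten it;
    // List(s, extra) would be List[Any] and fail to compile here too
    val output: List[Int] = rand6.flatMap(s => s :: extra)

    println(output)
    // List(1, 5, 6, 7, 8, 2, 5, 6, 7, 8, 3, 5, 6, 7, 8, 4, 5, 6, 7, 8)
  }
}
```

On a real RDD the same lambda works inside `rdd.flatMap`, with the broadcast list accessed as `bv.value`.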

Upvotes: 1
