Georg Heiler

Reputation: 17676

Convert DataFrame to RDD[Map] in Scala

I want to convert a DataFrame created like this:

case class Student(name: String, age: Int)

val dataFrame: DataFrame = sql.createDataFrame(
  sql.sparkContext.parallelize(List(Student("Torcuato", 27), Student("Rosalinda", 34))))

When I collect the results from the DataFrame, I get an Array[org.apache.spark.sql.Row] = Array([Torcuato,27], [Rosalinda,34]).

I'm looking to convert the DataFrame into an RDD[Map], e.g.:

Map("name" -> nameOFFirst, "age" -> ageOfFirst)
Map("name" -> nameOFsecond, "age" -> ageOfsecond)

I tried to use map via x._1, but that does not seem to work for Array[org.apache.spark.sql.Row]. How can I perform this transformation?
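(For context: a Row is not a tuple, so accessors like _1 are not available; fields are read by position or by name. A minimal sketch, assuming the dataFrame defined above:)

import org.apache.spark.sql.Row

// Row fields are accessed positionally or by field name, not as tuple members.
val first: Row = dataFrame.collect().head
val name = first.getString(0)       // by position
val age  = first.getAs[Int]("age")  // by name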

Upvotes: 0

Views: 5308

Answers (2)

Colin Wang

Reputation: 851

In other words, you can transform each row of the DataFrame into a map with getValuesMap; the following works:

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.DataFrame

def dfToMapOfRdd(df: DataFrame): RDD[Map[String, Any]] = {
    // getValuesMap builds a Map from field name to value for each row
    val result: RDD[Map[String, Any]] = df.rdd.map(row => {
        row.getValuesMap[Any](row.schema.fieldNames)
    })
    result
}
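As a usage sketch, assuming the Student DataFrame from the question:

val maps: Array[Map[String, Any]] = dfToMapOfRdd(dataFrame).collect()
// e.g. Map("name" -> "Torcuato", "age" -> 27), Map("name" -> "Rosalinda", "age" -> 34)
maps.foreach(println)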

refs: https://stackoverflow.com/a/46156025/6494418

Upvotes: 0

iboss

Reputation: 486

You can use the map function with pattern matching to do the job here:

import org.apache.spark.sql.Row

dataFrame.rdd
  .map { case Row(name, age) => Map("name" -> name, "age" -> age) }

This will result in an RDD[Map[String, Any]].
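If you prefer typed bindings over Any, the pattern match can name the field types explicitly; a minimal sketch, assuming the same two-column schema (a schema mismatch would throw a MatchError at runtime):

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row

val typed: RDD[Map[String, Any]] = dataFrame.rdd.map {
  case Row(name: String, age: Int) => Map("name" -> name, "age" -> age)
}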

Upvotes: 6
