Dipali Deshmukh

Reputation: 11

Spark Scala: Convert Map into Row object

I want to convert a Scala Map into a Row object (essentially what Row(**dict) does in Python; I need to achieve the same in Scala Spark).

input:  Map(com.project.name -> "A", com.project.age -> 23)
output: Row(com.project.name = "A", com.project.age = 23)

Please help.

Upvotes: 1

Views: 1575

Answers (2)

Raphael Roth

Reputation: 27383

You can use Row.fromSeq:

import org.apache.spark.sql.Row

val m = Map("com.project.name" -> "A", "com.project.age" -> "23")
val row = Row.fromSeq(m.toSeq)

or alternatively Row(m.toSeq: _*)

Both give [(com.project.name,A),(com.project.age,23)], i.e. each key-value pair ends up as a tuple field inside the Row.
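
As a side note, if the goal is something closer to Python's Row(**dict), where the keys become field names rather than part of the data, one possible sketch (not part of this answer, and assuming all values are strings) is to keep the keys in a separate schema and put only the values into the Row:

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

val m = Map("com.project.name" -> "A", "com.project.age" -> "23")

// Keys become the field names of a schema; values become the Row's cells.
// keys and values of the same Map iterate in a consistent order, so the
// positions line up.
val schema = StructType(m.keys.toSeq.map(k => StructField(k, StringType)))
val valuesRow = Row.fromSeq(m.values.toSeq)

If a DataFrame is needed, the schema and the Row can then be combined via spark.createDataFrame.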

Upvotes: 3

Nikunj Kakadiya

Reputation: 3008

You can convert the map into a DataFrame as follows:

import org.apache.spark.sql.functions._
import spark.implicits._  // needed for .toDF; assumes an active SparkSession named `spark`

val input: Map[String, String] = Map("com.project.name" -> "A", "com.project.age" -> "23")

// Start with a one-column DataFrame built from the first map entry,
// then fold over the rest of the map, adding one literal column per entry.
val df = input.tail.foldLeft(Seq(input.head._2).toDF(input.head._1)) {
  (acc, curr) => acc.withColumn(curr._1, lit(curr._2))
}

Now, if you want to get the Row from the DataFrame:

val row = df.first

And if you want to see the column names:

val columns = df.columns
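
Because df.first returns a Row that carries the DataFrame's schema, the fields can also be read back by name with getAs. A small usage sketch, reusing the row and columns values from above:

// Read every field of the row by its column name.
val values = columns.map(name => row.getAs[String](name))
// e.g. row.getAs[String]("com.project.name") returns "A"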

Upvotes: 0
