Global Warrior

Reputation: 5110

Converting a column of Array of Array to Array type in Scala/Spark

I have a data frame which looks like below:

+---+-----+--------------------------------------------------------------------------------------------------+------+
|uid|label|features                                                                                          |weight|
+---+-----+--------------------------------------------------------------------------------------------------+------+
|1  |1.0  |[WrappedArray([animal_indexed,2.0,animal_indexed]), WrappedArray([talk_indexed,3.0,talk_indexed])]|1     |
|2  |0.0  |[WrappedArray([animal_indexed,1.0,animal_indexed]), WrappedArray([talk_indexed,2.0,talk_indexed])]|1     |
|3  |1.0  |[WrappedArray([animal_indexed,0.0,animal_indexed]), WrappedArray([talk_indexed,1.0,talk_indexed])]|1     |
|4  |2.0  |[WrappedArray([animal_indexed,0.0,animal_indexed]), WrappedArray([talk_indexed,0.0,talk_indexed])]|1     |
+---+-----+--------------------------------------------------------------------------------------------------+------+

and the schema is

root
 |-- uid: integer (nullable = false)
 |-- label: double (nullable = false)
 |-- features: array (nullable = false)
 |    |-- element: array (containsNull = true)
 |    |    |-- element: struct (containsNull = true)
 |    |    |    |-- name: string (nullable = true)
 |    |    |    |-- value: double (nullable = false)
 |    |    |    |-- term: string (nullable = true)
 |-- weight: integer (nullable = false)

But I want to flatten the features column from Array[Array[Struct]] to Array[Struct], i.e. flatMap the nested arrays into a single array in the same column, to get a schema like

root
 |-- uid: integer (nullable = false)
 |-- label: double (nullable = false)
 |-- features: array (nullable = false)
 |    |-- element: struct (containsNull = true)
 |    |    |-- name: string (nullable = true)
 |    |    |-- value: double (nullable = false)
 |    |    |-- term: string (nullable = true)
 |-- weight: integer (nullable = false)

Thanks in advance.

Upvotes: 0

Views: 1535

Answers (1)

illak zapata

Reputation: 66

You should read your data as a typed Dataset, using case classes that match the schema (the field names must match the column names for `.as[MyClass]` to work):

case class Something(name: String, value: Double, term: String)
case class MyClass(uid: Int, label: Double, features: Seq[Seq[Something]], weight: Int)

Then use a UDF to flatten the nested array, writing the result back into the same column:

val flatUDF = udf((list: Seq[Seq[Something]]) => list.flatten)

val flattenedDF = myDataFrame.withColumn("features", flatUDF($"features"))

Example of reading the Dataset:

val myDataFrame = spark.read.json(path).as[MyClass]

Hope this helps.
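As a side note, if you are on Spark 2.4 or later, there is a built-in flatten function that avoids the UDF entirely. A minimal sketch, assuming a DataFrame df with the schema from the question:

```scala
// Sketch: flatten array<array<struct>> into array<struct> using the
// built-in flatten function (available in Spark 2.4+), no UDF needed.
import org.apache.spark.sql.functions.flatten

// Overwrites the features column with its flattened version; the
// element struct type (name, value, term) is preserved.
val flattenedDF = df.withColumn("features", flatten($"features"))
```

Built-in functions like this are generally preferable to UDFs because Catalyst can optimize them and no serialization round-trip to the JVM closure is needed.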

Upvotes: 1
