Reputation: 1758
I have the following function that flattens a sequence of maps of String to Double. How can I make the String-to-Double part generic?
val flattenSeqOfMaps = udf { values: Seq[Map[String, Double]] => values.flatten.toMap }
flattenSeqOfMaps: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,MapType(StringType,DoubleType,false),Some(List(ArrayType(MapType(StringType,DoubleType,false),true))))
I need something like,
val flattenSeqOfMaps[S,D] = udf { values: Seq[Map[S, D]] => values.flatten.toMap }
Thanks.
Edit 1: I'm using Spark 2.3. I am aware of the higher-order functions in Spark 2.4.
Edit 2: I got a bit closer. What do I need in place of f _ in val flattenSeqOfMaps = udf { f _ }? Please compare the joinMap and flattenSeqOfMaps type signatures below.
scala> val joinMap = udf { values: Seq[Map[String, Double]] => values.flatten.toMap }
joinMap: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,MapType(StringType,DoubleType,false),Some(List(ArrayType(MapType(StringType,DoubleType,false),true))))
scala> def f[S,D](values: Seq[Map[S, D]]): Map[S,D] = { values.flatten.toMap}
f: [S, D](values: Seq[Map[S,D]])Map[S,D]
scala> val flattenSeqOfMaps = udf { f _}
flattenSeqOfMaps: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,MapType(NullType,NullType,true),Some(List(ArrayType(MapType(NullType,NullType,true),true))))
Edit 3: The following code worked for me.
scala> val flattenSeqOfMaps = udf { f[String,Double] _}
flattenSeqOfMaps: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,MapType(StringType,DoubleType,false),Some(List(ArrayType(MapType(StringType,DoubleType,false),true))))
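Outside Spark, the generic function behaves as you would expect; one thing worth noting is that toMap keeps the last value when the same key appears in more than one map. A plain-Scala sketch:

```scala
// Generic flatten, exactly as defined above.
def f[S, D](values: Seq[Map[S, D]]): Map[S, D] = values.flatten.toMap

// On duplicate keys, toMap keeps the last occurrence.
val flattened = f(Seq(Map("a" -> 1.0), Map("b" -> 2.0), Map("a" -> 3.0)))
// flattened == Map("a" -> 3.0, "b" -> 2.0)
```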
Upvotes: 1
Views: 794
Reputation: 1758
The following code worked for me.
scala> def f[S,D](values: Seq[Map[S, D]]): Map[S,D] = { values.flatten.toMap}
f: [S, D](values: Seq[Map[S,D]])Map[S,D]
scala> val flattenSeqOfMaps = udf { f[String,Double] _}
flattenSeqOfMaps: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,MapType(StringType,DoubleType,false),Some(List(ArrayType(MapType(StringType,DoubleType,false),true))))
Upvotes: 2
Reputation: 4151
While you could define your function as
import scala.reflect.runtime.universe.TypeTag
def flattenSeqOfMaps[S: TypeTag, D: TypeTag] = udf {
  values: Seq[Map[S, D]] => values.flatten.toMap
}
and then use specific instances:
val df = Seq(Seq(Map("a" -> 1), Map("b" -> 1))).toDF("val")
val flattenSeqOfMapsStringInt = flattenSeqOfMaps[String, Int]
df.select($"val", flattenSeqOfMapsStringInt($"val") as "val").show
+--------------------+----------------+
| val| val|
+--------------------+----------------+
|[[a -> 1], [b -> 1]]|[a -> 1, b -> 1]|
+--------------------+----------------+
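The TypeTag context bounds are what let udf recover S and D at runtime and turn them into a Spark schema (StringType, DoubleType, and so on); without them, type erasure leaves nothing to build the schema from, which is why the plain f _ version came back as NullType. A minimal illustration, using a hypothetical describe helper:

```scala
import scala.reflect.runtime.universe.{TypeTag, typeOf}

// A TypeTag carries the full type information into runtime,
// which is what Spark's udf uses to derive the result schema.
def describe[S: TypeTag, D: TypeTag]: String =
  s"Map[${typeOf[S]}, ${typeOf[D]}]"

// describe[String, Double] yields "Map[String, Double]"
```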
it is also possible to use built-in functions, without any need for explicit generics:
import org.apache.spark.sql.functions.{expr, flatten, map_from_arrays}
def flattenSeqOfMaps_(col: String) = {
  val keys = flatten(expr(s"transform(`$col`, x -> map_keys(x))"))
  val values = flatten(expr(s"transform(`$col`, x -> map_values(x))"))
  map_from_arrays(keys, values)
}
df.select($"val", flattenSeqOfMaps_("val") as "val").show
+--------------------+----------------+
| val| val|
+--------------------+----------------+
|[[a -> 1], [b -> 1]]|[a -> 1, b -> 1]|
+--------------------+----------------+
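For intuition, the same keys/values-then-zip logic can be sketched with plain Scala collections (not Spark code, just an analogue of what the expressions above compute):

```scala
// flatten(transform(col, x -> map_keys(x)))   ~ maps.flatMap(_.keys)
// flatten(transform(col, x -> map_values(x))) ~ maps.flatMap(_.values)
// map_from_arrays(keys, values)               ~ keys.zip(values).toMap
val maps = Seq(Map("a" -> 1), Map("b" -> 1))
val keys = maps.flatMap(_.keys)
val values = maps.flatMap(_.values)
val merged = keys.zip(values).toMap
// merged == Map("a" -> 1, "b" -> 1)
```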
Upvotes: 2