Yann Moisan

Reputation: 8281

Is it possible to use Option with spark UDF

I'd like to use Option as the input type for my functions:

udf((oa: Option[String], ob: Option[String]) => …)

to handle null values in a more functional way.

Is there a way to do that?

Upvotes: 2

Views: 1586

Answers (1)

zero323

Reputation: 330093

As far as I know it is not directly possible. However, nothing stops you from wrapping the arguments in Options inside the UDF:

udf((oa: String, ob: String) => (Option(oa), Option(ob)) match {
  ...
})
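For illustration, the pattern match above could be completed like this (the combining logic, concatenating the two values, is an assumption made for the example, not part of the answer):

```scala
// Sketch: wrap possibly-null inputs in Option inside the UDF.
// Returns null when either input is null, so Spark sees a nullable column.
val concatNullable = udf((oa: String, ob: String) =>
  (Option(oa), Option(ob)) match {
    case (Some(a), Some(b)) => s"$a - $b"
    case _                  => null
  })
```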

using Dataset encoders:

val df = Seq(("a", None), ("b", Some("foo"))).toDF("oa", "ob")

df.as[(Option[String], Option[String])]
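With the typed Dataset you can then work with the Options directly, for example (a minimal sketch, assuming `spark.implicits._` is in scope so the result encoder is available):

```scala
// Map over the typed rows; None propagates naturally via flatMap/map.
df.as[(Option[String], Option[String])]
  .map { case (oa, ob) => oa.flatMap(a => ob.map(b => s"$a - $b")) }
```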

or adding some implicit conversions:

implicit def asOption[T](value: T) : Option[T] = Option(value)

def foo(oa: Option[String], ob: Option[String]) = {
  oa.flatMap(a => ob.map(b => s"$a - $b"))
}

def wrap[T, U, V](f: (Option[T], Option[U]) => V) = 
  (t: T, u: U) => f(Option(t), Option(u))

val foo_ = udf(wrap(foo))
df.select(foo_($"oa", $"ob"))

Upvotes: 5
