Georg Heiler

Reputation: 17724

Use a Spark SQL UDF in the DataFrame API

How can I use a UDF that works great in Spark SQL, like

sparkSession.sql("select * from chicago where st_contains(st_makeBBOX(0.0, 0.0, 90.0, 90.0), geom)").show

taken from http://www.geomesa.org/documentation/user/spark/sparksql.html, via Spark's more typesafe Scala DataFrame API?

Upvotes: 0

Views: 344

Answers (2)

Karan Gupta

Reputation: 61

@Oliviervs, I think he's looking for something different: Georg wants to call the UDF by its registered string name inside the DataFrame's select API. For example:

val squared = (s: Long) => {
  s * s
}
spark.udf.register("square", squared)

df.select(getUdf("square", col("num")).as("newColumn")) // something like this

The question at hand is whether there exists a function like getUdf that can retrieve a UDF registered under a string name. Georg, is that right?
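For what it's worth, Spark does ship a function with exactly this shape: `org.apache.spark.sql.functions.callUDF` resolves a registered UDF by its string name (in Spark 3.2+ it was renamed `call_udf`). A minimal sketch, assuming a local Spark 2.x session:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{callUDF, col}

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("udf-by-name")
  .getOrCreate()
import spark.implicits._

val df = Seq(2L, 3L).toDF("num")

// Register the UDF under a string name, exactly as above.
spark.udf.register("square", (s: Long) => s * s)

// callUDF looks the UDF up by its registered name -- this plays
// the role of the hypothetical getUdf above.
df.select(callUDF("square", col("num")).as("newColumn")).show()
```

An equivalent, if you prefer SQL-expression syntax inside the DataFrame API, is `df.selectExpr("square(num) as newColumn")`.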

Upvotes: 0

Oliviervs

Reputation: 51

If you have created a function, you can register it as a UDF using:

sparkSession.udf.register("yourFunctionName", yourFunction)

Note that register takes a name as well as the function itself; that name is what Spark SQL resolves against.
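A minimal register-then-use sketch, assuming a local Spark session and a hypothetical yourFunction standing in for whatever function you created:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("register-udf")
  .getOrCreate()
import spark.implicits._

// Hypothetical function standing in for yourFunction.
val yourFunction = (s: String) => s.toUpperCase

// register needs a name in addition to the function; the name is
// what SQL strings and selectExpr resolve against.
spark.udf.register("yourFunction", yourFunction)

// Once registered, the name works both in spark.sql("...") and here:
Seq("a", "b").toDF("letter")
  .selectExpr("yourFunction(letter) as upper")
  .show()
```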

I hope this helps.

Upvotes: 1
