CClarke

Reputation: 586

Why won't the exp function work in pyspark?

I'm trying to calculate odds ratios from the coefficients of a logistic regression but I'm encountering a problem best summed up by this code:

import pyspark.sql.functions as F 
F.exp(1.2)

This fails with py4j.Py4JException: Method exp([class java.lang.Double]) does not exist

Passing an integer fails similarly. I don't understand how a Double can be a problem for the exp function?

Upvotes: 0

Views: 3297

Answers (2)

enamya

Reputation: 111

As @pissall mentioned, pyspark.sql.functions.exp takes a Column object as its parameter, but you can use pyspark.sql.functions.lit (introduced in version 1.3.0) to wrap a literal value in a Column:

from pyspark.sql.functions import exp, lit

df = df.withColumn("exp_1", exp(lit(1)))
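That said, if the goal is just to exponentiate plain Python floats, such as the fitted logistic-regression coefficients mentioned in the question, Spark isn't needed at all: the standard library's math.exp works directly on scalars. A minimal sketch, assuming the coefficients have already been collected into a Python list (the coefficient values here are made up for illustration):

```python
import math

# Hypothetical fitted logistic-regression coefficients (plain Python floats)
coefficients = [1.2, -0.5, 0.03]

# Odds ratio for each coefficient: OR = e^beta
odds_ratios = [math.exp(b) for b in coefficients]
print(odds_ratios)
```

Use F.exp/lit only when the value needs to live inside a DataFrame column; for a handful of scalars, math.exp is simpler.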

Upvotes: 0

pissall

Reputation: 7419

If you have a look at the documentation for pyspark.sql.functions.exp(), it takes a Column object as input. Hence it will not work on a plain Python float such as 1.2.

Pass a Column object (for example, one created with F.col() from an existing DataFrame column) to F.exp() instead.

An example would be:

df = df.withColumn("exp_x", F.exp(F.col("some_col_named_x")))

Upvotes: 2
