Alejandro A

Reputation: 1190

Pyspark UDF - TypeError: 'module' object is not callable

I am trying to run the following code based on some tutorial I found online:

import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql import functions
from pyspark.sql import udf
df_pd = pd.DataFrame(
    data={'integers': [1, 2, 3],
          'floats': [-1.0, 0.5, 2.7],
          'integer_arrays': [[1, 2], [3, 4, 5], [6, 7, 8, 9]]}
)

df = spark.createDataFrame(df_pd)
df.show()

def square(x):
    return x**2
from pyspark.sql.types import IntegerType
square_udf_int = udf(lambda z: square(z), IntegerType())

But when I run the last line I get the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'module' object is not callable

I am using Spark 2.3.3 on Hadoop 2.7.

Thanks

Upvotes: 3

Views: 4462

Answers (1)

Exorcismus

Reputation: 2482

It seems you're importing udf from pyspark.sql when it should come from pyspark.sql.functions. pyspark.sql also contains a udf submodule, so from pyspark.sql import udf binds that module rather than the udf function, and calling a module gives TypeError: 'module' object is not callable. Import it like this instead:

import pyspark.sql.functions as F

udf_fun = F.udf(lambda ..., Type())
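
Applied to the code in the question, a minimal sketch might look like the following. It assumes a SparkSession named spark is available (the pyspark shell provides one; otherwise it is created explicitly below):

import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf   # udf lives in pyspark.sql.functions
from pyspark.sql.types import IntegerType

# The pyspark shell already defines `spark`; create it when running as a script.
spark = SparkSession.builder.getOrCreate()

df_pd = pd.DataFrame(
    data={'integers': [1, 2, 3],
          'floats': [-1.0, 0.5, 2.7],
          'integer_arrays': [[1, 2], [3, 4, 5], [6, 7, 8, 9]]}
)
df = spark.createDataFrame(df_pd)

def square(x):
    return x ** 2

# Wrap the plain Python function as a Spark UDF that returns an integer column.
square_udf_int = udf(lambda z: square(z), IntegerType())

df.withColumn('int_squared', square_udf_int('integers')).show()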

Upvotes: 10
