practicalGuy

Reputation: 1328

TypeError: Invalid argument, not a string or column: [79, -1, -1] of type &lt;class 'list'&gt;. For column literals, use 'lit', 'array', 'struct' or 'create_map'

I have a problem with a PySpark UDF; it throws the error below:

PythonException: An exception was thrown from a UDF: 'TypeError: Invalid argument, not a string or column: [79, -1, -1] of type <class 'list'>. For column literals, use 'lit', 'array', 'struct' or 'create_map' function.'

I'm trying to pick a number from each row according to the precedence array below; precedence decreases from left to right.

precedence = [1,2,11,12,13,20,20131,200,202,203,210,220,223,226,235,236,237,242,244,245,247,253,254,257,259,260,262,278,283,701,20107,20108,20109,20112,20115,20123,20135,20141,20144,20152,20162,20163,20167,20168,20169,20170,20171,20172,20173,20174,20175,14,211,213,258,270,273,274,275,277,280,281,287,288,20120,20122,20124,20125,20126,20130,20133,20136,20137,20138,20140,20142,20143,20154,20155,20156,20157]
reverse_order = precedence[::-1]

def get_p(row):
    if (row!=None) and (row!="null"):
        temp = row.split(",")
        test = []
        for i in temp:
            if (i.find('=')!=-1):
                i = i.split('=')[0]
            if int(i) in reverse_order:
                test.append(reverse_order.index(int(i)))
            else:
                test.append(-1)
        if max(test)!=-1:
            return reverse_order[max(test)]
        return -999
    else:
        return None

get_array = udf(get_p, IntegerType())

bronze_table = bronze_table.withColumn("precedence", get_array("event_list"))
bronze_table.select("event_list","precedence").show(100, False)

Here are the sample records,

+---------------------------------------------------------------------------------------+
|event_list                                                                             |
+---------------------------------------------------------------------------------------+
|276,100,101,202,176                                                                    |
|276,100,2,124,176                                                                      |
|246,100,101,257,115,116,121,123,124,125,135,138,145,146,153,167,168,170,171,173,189,191|
|246,100,101,278,123,124,135,170,189,191                                                |
|20131=16,20151,100,101,102,115,116,121,123,124,125,135,138,145,146,153,168,170,171     |
|null                                                                                   |
|20107=9,20151,100,101,102,123,124,135,170,189,191                                      |
|20108=3,20151,100,101,102,123,124,125,135,170,171,189,191                              |
|null                                                                                   |
+---------------------------------------------------------------------------------------+

What I am expecting

+---------------------------------------------------------------------------------------+----------+
|event_list                                                                             |precedence|
+---------------------------------------------------------------------------------------+----------+
|276,100,101,202,176                                                                    |202       |
|276,100,2,124,176                                                                      |2         |
|246,100,101,257,115,116,121,123,124,125,135,138,145,146,153,167,168,170,171,173,189,191|257       |
|246,100,101,278,123,124,135,170,189,191                                                |278       |
|20131=16,20151,100,101,102,115,116,121,123,124,125,135,138,145,146,153,168,170,171     |20131     |
|null                                                                                   |null      |
|20107=9,20151,100,101,102,123,124,135,170,189,191                                      |20107     |
|20108=3,20151,100,101,102,123,124,125,135,170,171,189,191                              |20108     |
|null                                                                                   |null      |
+---------------------------------------------------------------------------------------+----------+

My UDF works as expected in plain Python, but not in PySpark. I request someone to please help me with this.
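For reference, here is a standalone check of the same logic in plain Python (no Spark). The precedence list is trimmed here to just the values that occur in the sample rows, keeping their original relative order, so the results match the expected output above:

```python
# Standalone check of get_p outside Spark. The precedence list is trimmed
# to the values appearing in the sample rows, in their original relative
# order (full list in the question).
precedence = [2, 20131, 202, 257, 278, 20107, 20108]
reverse_order = precedence[::-1]

def get_p(row):
    if row is not None and row != "null":
        test = []
        for i in row.split(","):
            if "=" in i:                      # strip "key=value" suffixes
                i = i.split("=")[0]
            if int(i) in reverse_order:
                test.append(reverse_order.index(int(i)))
            else:
                test.append(-1)
        if max(test) != -1:                   # plain Python: built-in max
            return reverse_order[max(test)]
        return -999
    return None

print(get_p("276,100,101,202,176"))   # 202
print(get_p("276,100,2,124,176"))     # 2
print(get_p("null"))                  # None
```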

Upvotes: 3

Views: 11695

Answers (2)

practicalGuy

Reputation: 1328

Thank you, everyone who tried to solve the problem. I was able to solve it using this link: https://stackoverflow.com/a/63654269/6187792

The issue was with the max function, as explained in the link: a star import from pyspark.sql.functions shadows Python's built-in max.
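The shadowing can be reproduced without Spark. The `shadowed_max` function below is a hypothetical stand-in mimicking `pyspark.sql.functions.max`, which expects a column name or Column rather than a Python list:

```python
import builtins

# Hypothetical stand-in mimicking pyspark.sql.functions.max, which rejects
# a plain Python list (it expects a column name or Column).
def shadowed_max(col):
    raise TypeError(
        f"Invalid argument, not a string or column: {col} of type {type(col)}"
    )

max = shadowed_max          # effect of `from pyspark.sql.functions import *`

try:
    max([79, -1, -1])       # raises the TypeError seen in the question
except TypeError as e:
    print(e)

print(builtins.max([79, -1, -1]))   # 79 -- the built-in is still reachable
```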

Here is the updated code.

import builtins as p

def get_p(row):
    if row is not None and row != "null":
        temp = row.split(",")
        test = []
        for i in temp:
            if i.find('=') != -1:
                i = i.split('=')[0]
            if int(i) in reverse_order:
                test.append(reverse_order.index(int(i)))
            else:
                test.append(-1)
        # builtins.max bypasses pyspark.sql.functions.max, which shadows
        # the built-in after `from pyspark.sql.functions import *`
        if p.max(test) != -1:
            return reverse_order[p.max(test)]
        return None
    else:
        return None

Upvotes: 2

pltc

Reputation: 6082

A null in a PySpark DataFrame is None in Python, so the condition if row != "null": is incorrect on its own. Try if row is not None: instead.
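A quick plain-Python illustration of the difference:

```python
# A SQL NULL reaches a UDF as Python None, not as the string "null".
row = None
print(row == "null")   # False -- the string comparison does not catch NULL
print(row is None)     # True  -- this is the check that handles NULL
```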

However, your get_p function doesn't behave correctly for me. For example:

get_p('276,100,101,202,176')
# output: 2
# expected: 202

get_p('20131=16,20151,100,101,102,115,116,121,123,124,125,135,138,145,146,153,168,170,171')
# output: Exception `invalid literal for int() with base 10: ''`
# expected: 20131

Upvotes: 0
