Monolith

Reputation: 1147

Adding custom number class to Python int results in "TypeError"

Is there a way to add an instance of a custom Python class to an integer (the built-in int class)? For example, I have a class with magic methods:

class IntType:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        return self.value + other

# Works
print(IntType(10) + 10)

# Doesn't Work
print(10 + IntType(10))

I can't add my IntType to the built-in int class. If I try to add IntType to an integer, I get the following error:

Traceback (most recent call last):
  File "test.py", line 8, in <module>
    print(10 + IntType(10))
TypeError: unsupported operand type(s) for +: 'int' and 'IntType'

The only way I can think of to get this to work is to somehow change the int class' __add__ method. In case you're wondering why I don't just add the int to the IntType (e.g. IntType(10) + 10): I need this to work for all operators, including ones where operand order matters, such as subtraction. I'm using Python 3.

Upvotes: 1

Views: 215

Answers (1)

jfaccioni

Reputation: 7509

Implementing reverse addition (__radd__) for your IntType class should fix this:

>>> class IntType:
...     def __init__(self, value):
...         self.value = value
...
...     def __add__(self, other):
...         return self.value + other
...
...     def __radd__(self, other):
...         return self.value + other

>>> IntType(10) + 10 == 10 + IntType(10)
True

__radd__ is the fallback Python tries when the left operand's __add__ can't handle the right operand (i.e., int.__add__ returns NotImplemented for an IntType argument, so Python then calls IntType.__radd__).
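Since the question mentions operators where order matters (like subtraction), here's a minimal sketch, not from the answer above, showing how the reflected method receives the operands swapped, so __rsub__ must compute other - self.value rather than self.value - other:

```python
class IntType:
    def __init__(self, value):
        self.value = value

    def __sub__(self, other):
        # Called for IntType(10) - 3  ->  10 - 3
        return self.value - other

    def __rsub__(self, other):
        # Called for 3 - IntType(10); the operands arrive reversed,
        # so we subtract self.value from other, not the other way around.
        return other - self.value

print(IntType(10) - 3)  # 7
print(3 - IntType(10))  # -7
```

The same pattern applies to the other reflected methods (__rmul__, __rtruediv__, __rfloordiv__, etc.): for commutative operators the body can match the forward method, but for non-commutative ones the operand order must be flipped.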

Upvotes: 3
