Reputation: 14834
So I have:
def gcd(a,b):
    if a == 0:
        return b
    while b != 0:
        if a > b:
            a = a - b
        else:
            b = b - a
    return a
But when I call gcd(1,2) from the console, this error comes up:

Traceback (most recent call last):
  File "", line 1, in <module>
    G.gcd(1,2)
TypeError: gcd() takes exactly 2 arguments (3 given)

which makes no sense at all, since I only gave 2 arguments. What did I do wrong?
Alright, so I deleted everything else and this is the only thing left in my file:
import random
import math

class RSA:
    def gcd(a,b):
        if a == 0:
            return b
        while b != 0:
            if a > b:
                a = a - b
            else:
                b = b - a
        return a
and the problem still persists
Upvotes: 1
Views: 251
Reputation: 20860
You could use the staticmethod decorator like this:

...

class RSA:
    @staticmethod
    def gcd(a,b):
        ...
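For completeness, here is a fuller sketch of that approach, with the gcd body from the question dropped in (the print calls at the end are just a usage illustration):

class RSA:
    @staticmethod
    def gcd(a, b):
        # greatest common divisor by repeated subtraction
        if a == 0:
            return b
        while b != 0:
            if a > b:
                a = a - b
            else:
                b = b - a
        return a

print(RSA.gcd(1, 2))    # 1, called on the class
print(RSA().gcd(1, 2))  # 1, called on an instance; no implicit self is passed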
Upvotes: 3
Reputation: 24802
There's already an accepted answer that solves the problem, but I'd like to point out that (IMHO) a more idiomatic solution in Python would be to move gcd out of the class and make it a plain function at module level. It's a general-purpose function; there's no reason in Python to encapsulate it in a class.
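A minimal sketch of that layout, assuming a module name of rsa_utils.py (the name is just for illustration) and reusing the gcd body from the question:

# rsa_utils.py, hypothetical module layout
def gcd(a, b):
    """Greatest common divisor via repeated subtraction."""
    if a == 0:
        return b
    while b != 0:
        if a > b:
            a = a - b
        else:
            b = b - a
    return a

class RSA:
    # key generation and the rest of the class can simply call the module-level gcd()
    pass

if __name__ == "__main__":
    print(gcd(12, 18))  # prints 6

In recent Pythons you may not even need to write it yourself: the standard library ships a gcd (fractions.gcd in Python 2, math.gcd since Python 3.5).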
Upvotes: 1
Reputation: 156138
You have not posted all of your code. The code you didn't post probably looks something like this:
class SomeClass:
    def gcd(a,b):
        if a == 0:
            return b
        while b != 0:
            if a > b:
                a = a - b
            else:
                b = b - a
        return a

G = SomeClass()
G.gcd(1,2)
In Python, when you define a member function in a class and call it on an instance, that instance is automatically passed to the function as its first argument. Change your code to look like this:

class SomeClass:
    def gcd(self, a, b):
        if a == 0:
            return b
        while b != 0:
            if a > b:
                a = a - b
            else:
                b = b - a
        return a

G = SomeClass()
G.gcd(1,2)
or better yet
def gcd(a, b):
    if a == 0:
        return b
    while b != 0:
        if a > b:
            a = a - b
        else:
            b = b - a
    return a

gcd(1,2)
and all will be well.
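To see where the surprising third argument comes from, note that a call through an instance is just shorthand for passing the instance explicitly. A small sketch (show_args is only an illustrative name):

class SomeClass:
    def show_args(self, a, b):
        # self is the instance the method was called on
        print(self, a, b)

G = SomeClass()
G.show_args(1, 2)             # Python inserts G as self, so 3 arguments in total
SomeClass.show_args(G, 1, 2)  # the equivalent explicit call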
Upvotes: 5
Reputation: 1103
The code as you provided it works fine in an interpreter. If you're calling gcd() as a method of a class, you need to add self as the first parameter:

def gcd(self, a, b):
    # ...
Upvotes: 0
Reputation: 3574
If gcd is a method inside the class, then you should define it as follows:

def gcd(self, a, b):
    # etc.
Upvotes: 2
Reputation: 28090
You're calling the function as a method. Add self as the first parameter of the definition and it'll work.
Upvotes: 7
Reputation: 38956
I suspect (since the error is in code you haven't shown) that gcd is actually a method and you are calling it as obj.gcd(a, b). That call is translated to gcd(obj, a, b), i.e. 3 arguments, so you need to define the function with self as its first parameter.
Upvotes: 1
Reputation: 3213
I just threw that code into the Python interactive interpreter and it seems to work just fine. Are you sure gcd isn't being redefined anywhere? Python will let you redefine functions without any warning, if I remember correctly.
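For what it's worth, here is a tiny sketch of that pitfall (the bodies are placeholders, only the signatures matter); the second def silently replaces the first, and the old two-argument version is gone:

def gcd(a, b):
    # first definition, takes two arguments
    return 1

def gcd(a, b, c):
    # same name again: this quietly replaces the two-argument version
    return 1

gcd(1, 2)  # now raises a TypeError about the wrong number of arguments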
Upvotes: 0