Reputation: 13
I am learning object-oriented concepts in Python. Below, I have created a class with a method in it. I am trying to find the minimum of a list both by calling the min() function directly and by calling the class method findMin().
The below code gives me an error:

ValueError: min() arg is an empty sequence

Please tell me what I am missing here.
class Solution:
    nums = list()
    def findMin(self, nums):
        self.nums.sort()
        out = min(self.nums)
        return out

x = [4, 5, 6, 7, 0, 1, 2]
y = Solution()
print min(x)
print y.findMin(x)
print len(x)
print type(y)
print dir(y)
Upvotes: 0
Views: 1383
Reputation: 1887
You are confusing nums with self.nums.
When you write:

nums = list()

you are setting a variable on the class, a class attribute that is shared by every instance.
When you write:

def findMin(self, nums):

you are receiving the argument in a local variable, nums.
When you then write self.nums on the next two lines, the attribute lookup finds no instance attribute named nums, so it falls through to the class attribute nums, which is the empty list; the nums parameter is never used at all.
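You can see that fallback for yourself with a minimal sketch (the class name Demo here is made up for illustration):

class Demo:
    nums = []                  # class attribute, shared by every instance

d = Demo()
print(d.nums is Demo.nums)     # True: d.nums falls through to the class attribute
d.nums.append(1)               # mutates the shared class-level list
print(Demo.nums)               # [1]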
As such, you are essentially sorting an empty list and then trying to find its minimum. This isn't going to work, since there's no value in an empty list to find the minimum of.
Hence, the error that you see:
ValueError: min() arg is an empty sequence
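You can reproduce the same error with nothing but the built-in:

min([])   # raises ValueError: min() arg is an empty sequence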
To solve this, use nums rather than self.nums inside findMin, because then you'll be referencing the parameter rather than the empty class attribute.
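For example, here is the asker's method with just that substitution made (note that nums.sort() reorders the caller's list in place, and the sort isn't actually needed, since min() works on unsorted lists too):

class Solution:
    def findMin(self, nums):
        nums.sort()        # sorts the list that was passed in, in place
        out = min(nums)    # now operates on the parameter, not an empty class attribute
        return out

x = [4, 5, 6, 7, 0, 1, 2]
y = Solution()
print(y.findMin(x))   # 0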
Upvotes: 1