Reputation: 3
I'm making the average function assignment from swift tutorial. When I write code like this:
func AVG(numbers: Int...) -> Double {
    var i = 0, sum = 0; var avg: Double
    for number in numbers {
        i += 1
        sum += number
        avg = sum/i
    }
    return avg
}
let Average = AVG(numbers: 3, 4, 5)
print(Average)
It displays an error message saying that the binary operator '/' cannot be applied to two 'Int' operands.
But when I perform a simple division like
let x = 5
let y = 2
let quo = x / y
it works.
What am I doing wrong in the function?
Upvotes: 0
Views: 108
Reputation: 285160
There are two issues in the AVG function. (By the way, function names are supposed to start with a lowercase letter.)
A playground displays clear error messages:
variable 'avg' used before being initialized
Solution:
var avg = 0.0
cannot assign value of type 'Int' to type 'Double'
Solution:
avg = Double(sum) / Double(i)
PS: An alternative using the key-value coding collection operator @avg:
import Foundation

func avg(numbers: Int...) -> Double {
    return (numbers as NSArray).value(forKeyPath: "@avg.self") as! Double
}
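For instance, calling it should print 4.0 (this usage line is my own, not from the original answer):

let result = avg(numbers: 3, 4, 5)
print(result) // 4.0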
Upvotes: 0
Reputation: 19822
You declared avg as Double. numbers has type Int, and i = 0 makes i an Int as well. When you do sum/i, it's Int/Int, and you try to assign the result to a Double.
Swift does not perform type conversion automatically; it forces you to convert data types explicitly. You can fix the error by converting both of those variables to Double.
Keep in mind that you need to convert the Ints before doing the division; otherwise the division happens in integer arithmetic and the remainder is cut off, since Int cannot store fractional values. Credits to @martin.
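A quick illustration of the difference (the numbers here are my own example):

let sum = 7
let i = 2
let truncated = Double(sum / i)     // 3.0: the Int division runs first and drops the remainder
let exact = Double(sum) / Double(i) // 3.5: both operands are converted before dividing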
Another problem: if the numbers array is empty, you would return the uninitialized variable avg.
Working code below:
func AVG(numbers: Int...) -> Double {
    var i = 0, sum = 0
    var avg: Double = 0 // the default value also covers an empty argument list
    for number in numbers {
        i += 1
        sum += number
        avg = Double(sum) / Double(i) // convert before dividing to keep the fraction
    }
    return avg
}
let Average = AVG(numbers: 3, 4, 5)
print(Average)
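As a side note, a shorter version (my own sketch, not from the original answer) sums with reduce and guards against an empty argument list instead of recomputing the average on every iteration:

func average(of numbers: Int...) -> Double {
    // Return 0 for an empty argument list instead of dividing by zero
    guard !numbers.isEmpty else { return 0 }
    return Double(numbers.reduce(0, +)) / Double(numbers.count)
}

print(average(of: 3, 4, 5)) // 4.0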
Upvotes: 1