Reputation: 5372
In the Swift "Tour" documentation, there's an exercise where you build on the following function to average a set of numbers:
func sumOf(numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum
}
I can make this work using something like the following:
func averageOf(numbers: Double...) -> Double {
    var sum: Double = 0, countOfNumbers: Double = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    let result: Double = sum / countOfNumbers
    return result
}
My question is, why do I have to cast everything as a Double to make it work? If I try to work with integers, like so:
func averageOf(numbers: Int...) -> Double {
    var sum = 0, countOfNumbers = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    var result: Double = sum / countOfNumbers
    return result
}
I get the following error: Could not find an overload for '/' that accepts the supplied arguments
Upvotes: 48
Views: 83342
Reputation: 21
Way late to the party, but the reason is that when you divide two Ints in Swift, the result is always an Int.
The compiler does this by truncating the value after the decimal point (e.g. 5 / 2 = 2, although the true result is 2.5).
To get the true average (the non-truncated value) you need to convert to Double, so that the fractional part is retained. Otherwise, it is lost.
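For illustration, a minimal sketch of the difference:

let truncated = 5 / 2              // 2: Int division discards the .5
let exact = Double(5) / Double(2)  // 2.5: both operands converted explicitly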
Upvotes: 2
Reputation: 24041
This may be helpful:
func averageOf(numbers: Int...) -> Double {
    var sum = 0, countOfNumbers = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    let result: Double = Double(sum) / Double(countOfNumbers)
    return result
}
Overloading the / operator can also be a solution; in Swift 4.x that would look like:
infix operator /: MultiplicationPrecedence

public func /<T: FixedWidthInteger>(lhs: T, rhs: T) -> Double {
    return Double(lhs) / Double(rhs)
}
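A usage sketch, assuming the overload above is in scope. Without a contextual type, the standard library's concrete (Int, Int) -> Int overload still wins over the generic one, so the type annotation is what selects the Double-returning version:

let quotient: Double = 5 / 2  // 2.5, via the generic overload above
let plain = 5 / 2             // still 2: the stdlib Int overload is preferred

Note that a global overload like this changes what / can mean for every pair of integers in scope, which can surprise readers, so the explicit Double(sum) / Double(countOfNumbers) conversion is usually the clearer choice.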
Upvotes: 13
Reputation: 61
I don't find a necessity for forced division; the normal division operator works. In the following code,
func average(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    var average: Float = 0
    average = Float(sum) / Float(numbers.count)
    return average
}

let averageResult = average(numbers: 20, 10, 30)
averageResult
Here, two Float values are divided, after explicit conversion, since I am storing the result in a Float variable and returning it.
Note: I have not used an extra variable to count the number of parameters. Variadic parameters in Swift arrive inside the function as an array, so "numbers" is an array here, and numbers.count (similar to Objective-C) returns the count of the arguments passed.
Upvotes: 4
Reputation: 11
There's no reason to manually track the number of arguments when you can get it directly.
func sumOf(numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    // numbers is an array inside the function, so numbers.count is the argument count;
    // guard against an empty call, since integer division by zero traps at runtime
    let average = numbers.isEmpty ? 0 : sum / numbers.count
    return average
}

sumOf()
sumOf(numbers: 42, 597, 12)
Upvotes: 1
Reputation: 11
Try this, but note that Swift traps at runtime when you divide by an integer that is zero, and sumOf() with no arguments makes the count zero; the masking division operator &/ forced the division to return 0 instead of crashing. (The &/ operator was removed in Swift 1.2, so this only compiles on older toolchains.) This code is a little verbose, but it is easy to understand, and it gives the answer as an integer, not a floating-point or double value.
func sumOf(numbers: Int...) -> Int {
    var sum = 0
    var i = 0
    var avg = 1
    for number in numbers {
        sum += number
        i += 1
    }
    // &/ returned 0 instead of trapping when i == 0 (early Swift only; removed in Swift 1.2)
    avg = sum &/ i
    return avg
}
sumOf()
sumOf(42, 597, 12)
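On a current toolchain, where &/ no longer exists, a guarded plain division is a rough equivalent. A minimal sketch (renamed averageOf here, since it returns the average):

func averageOf(numbers: Int...) -> Int {
    // plain / traps on a zero divisor, so return 0 for an empty call,
    // matching what &/ used to do
    guard !numbers.isEmpty else { return 0 }
    return numbers.reduce(0, +) / numbers.count
}

averageOf()                      // 0
averageOf(numbers: 42, 597, 12)  // 217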
Upvotes: 1
Reputation: 9418
You are assigning the output of / to a variable of type Double, so Swift thinks you want to call this function:

func /(lhs: Double, rhs: Double) -> Double

But the arguments you're passing it are not Doubles, and Swift doesn't do implicit casting.
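Making the conversion explicit satisfies that overload, for example:

let sum = 7
let countOfNumbers = 2
let result = Double(sum) / Double(countOfNumbers)  // 3.5: both operands are now Double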
Upvotes: 15
Reputation: 17460
The OP seems to know what the code has to look like, but he is explicitly asking why it is not working the other way.
So, "explicitly" is part of the answer he is looking for. Apple writes inside the "Language Guide", in the chapter "The Basics" -> "Integer and Floating-Point Conversion":

Conversions between integer and floating-point numeric types must be made explicit
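A short illustration of that rule:

let three = 3                       // inferred as Int
let pointFive = 0.5                 // inferred as Double
// let bad = three + pointFive      // error: no implicit conversion between Int and Double
let ok = Double(three) + pointFive  // 3.5, conversion made explicit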
Upvotes: 51
Reputation: 485
You just need to do this:
func averageOf(numbers: Int...) -> Double {
    var sum = 0, countOfNumbers = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    let result: Double = Double(sum) / Double(countOfNumbers)
    return result
}
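Called like this (Swift 3 and later require the argument label), it returns the exact average:

let result = averageOf(numbers: 1, 2)  // 1.5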
Upvotes: 33