mutantChickenHer0

Reputation: 223

Calculating the Average of Arguments

I'm working through "The Swift Programming Language" and the associated playground file.

I have one section where I am asked: Write a function that calculates the average of its arguments.

Given the context, this is my solution:

func averageOf(numbers: Int...) -> Int {
    var sum = 0
    var countOfNumbers = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    var result: Double = Double(sum) / Double(countOfNumbers)
    return result
}

averageOf()
averageOf(10, 20, 30)

As you can see, I had to cast the result to a Double (that's the sum / countOfNumbers).

However, I can't return result in this case, because I get an error about being unable to convert a return expression of type "Double".

So then I tried to return Double(result) without success.

Why does this not work and how can I best understand what I am doing wrong here?

Upvotes: 0

Views: 1205

Answers (4)

Scott Gardner

Reputation: 8739

In Swift 3...

Using variadic parameter:

func averageOf(_ n: Int...) -> Double {
    return Double(n.reduce(0, +)) / Double(n.count)
}

averageOf() // nan
averageOf(0) // 0
averageOf(1, 2, 3, 4) // 2.5

Using array parameter (guarding for empty array):

func averageOf(_ n: [Int]) -> Double {
    guard n.count > 0 else { return 0 }
    return Double(n.reduce(0, +)) / Double(n.count)
}

averageOf([Int]()) // 0
averageOf([1, 2, 3, 4]) // 2.5

Upvotes: 1

dfrib

Reputation: 73196

Just looking at your code (and not giving you yet another of the many neat ways to calculate this): your function declares return type Int, but you return a Double.

Below follows your code with commented corrections:

func averageOf(numbers: Int...) -> Double {
    var sum = 0                  // ^ note the return type here
    var countOfNumbers = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    /* Note here: you declare 'result' to be of type Double */
    var result: Double = Double(sum) / Double(countOfNumbers)
    return result /* and return 'result'; hence returning a Double */
}

averageOf(10, 20, 30) // 20.0 <-- a Double

Upvotes: 3

Laurent Rivard

Reputation: 519

If you look at your function definition, it returns an Int (the return type is what comes after the ->):

func averageOf(numbers: Int...) -> Int

Change it to Double if you want to return a Double, or cast your result to an Int before returning.
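A minimal sketch of both options (Swift 3 syntax; the underscore is added so the unlabeled call site from the question still compiles, and intAverageOf is just an illustrative name for the truncating variant):

```swift
// Option 1: change the declared return type to Double
func averageOf(_ numbers: Int...) -> Double {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Double(sum) / Double(numbers.count)
}

// Option 2: keep the Int return type and use integer division,
// guarding against dividing by zero when called with no arguments
func intAverageOf(_ numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return numbers.isEmpty ? 0 : sum / numbers.count
}

averageOf(10, 20, 30)    // 20.0
intAverageOf(10, 20, 25) // 18 (integer division truncates toward zero)
```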

Upvotes: 1

jtbandes

Reputation: 118731

Your function is declared as returning an Int:

func averageOf(numbers: Int...) -> Int {

You can't return a Double because it doesn't match this signature. You could either convert back to an Int before returning, or change the function signature.
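For instance, converting back before returning is a one-line change to the question's function (a sketch in Swift 3 syntax, with an underscore added so the unlabeled call compiles; note that Int(result) truncates, and would trap on the nan produced by calling with no arguments):

```swift
func averageOf(_ numbers: Int...) -> Int {
    var sum = 0
    var countOfNumbers = 0
    for number in numbers {
        sum += number
        countOfNumbers += 1
    }
    let result: Double = Double(sum) / Double(countOfNumbers)
    return Int(result) // convert back to Int to match the declared return type
}

averageOf(10, 20, 30) // 20
```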

Upvotes: 0
