Alexandre Dieulot

Reputation: 524

Incomprehensible errors in Swift

This works:

func averageOf(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum) / Float(numbers.count)
}
averageOf() // (not a number)
averageOf(42, 597, 12) // (217.0)

But this doesn't:

func averageOf(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum / numbers.count)
}
averageOf()
averageOf(42, 597, 12)

It gives me this error on the } line:

Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)

I stumbled upon another question with the same first and second snippets of code, and its author apparently doesn't get the same errors.

Let's remove that cast:

func averageOf(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum / numbers.count
}
averageOf()
averageOf(42, 597, 12)

It gives me this error on the division sign:

Cannot invoke '/' with an argument list of type '(@lvalue Int, Int)'

If I then change the return type of the function to Int:

func averageOf(numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum / numbers.count
}
averageOf()
averageOf(42, 597, 12)

I get the same EXC_BAD_INSTRUCTION error.

If I cast only numbers.count:

func averageOf(numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum / Float(numbers.count)
}
averageOf()
averageOf(42, 597, 12)

I get this error on the division sign:

Cannot invoke 'init' with an argument list of type '(@lvalue Int, $T5)'

I also get this error if I change the return type back to Float.

All of this makes no sense to me. Is it Xcode going postal, or have I missed something subtle?

Upvotes: 0

Views: 125

Answers (2)

Para

Reputation: 3711

As already explained, the problem in the second example is due to invoking averageOf() without arguments, which results in a division by zero. But the first averageOf() also works without arguments, so why doesn't the second? Let me add a few more details.

In the first case you reported, you get no error and averageOf() works, because you are casting the two Int operands to Float before the division.
In floating-point arithmetic, division by zero is well-defined: if you try 0.0 / 0.0 in a Playground, you won't get an error; the result will instead be "not a number" (NaN).

In the second case, however, you're trying to divide 0 by 0 before casting to Float, so the division is still performed on Int values. Integer division by zero is not defined and traps at runtime, which is what produces the error. If you try 0 / 0 in a Playground, you'll get an error too.

All the other cases not explained by the Int vs Float behavior are due to the fact that Swift requires you to explicitly convert between types, even where another language would convert the operands implicitly.
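
To see the two behaviors side by side, here is a small Playground sketch (the constant names are just for illustration):

let zeroFloat: Float = 0
let floatResult = zeroFloat / zeroFloat   // NaN: floating-point division by zero is defined
let zeroInt = 0
// let intResult = zeroInt / zeroInt      // uncommenting this traps at runtime (EXC_BAD_INSTRUCTION)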

Upvotes: 3

Marcus Rossel

Reputation: 3258

This error occurs because of your function call averageOf().
If you pass no values to the variadic parameter numbers, it creates an empty array.
Its count property therefore returns 0, and you can't divide by 0. This is also why the message says BAD_INSTRUCTION.

If you remove the averageOf() call from your second code example, it works.
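
If you want the argument-less call to keep working, one option is to handle the empty case before dividing. A minimal sketch (safeAverageOf is just an illustrative name, and returning 0 for no input is only one possible choice):

func safeAverageOf(numbers: Int...) -> Float {
    // Return early when no numbers were passed, so we never divide by zero
    if numbers.isEmpty {
        return 0
    }
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum) / Float(numbers.count)
}
safeAverageOf()            // 0.0, no crash
safeAverageOf(42, 597, 12) // 217.0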

Upvotes: 1
