Reputation: 47
I have a simple 2D array of integers named "round", with 5 elements on the horizontal and a variable number of "rows".
I want to calculate the total for each "column". Here's my code:
var totals = [Int]()
for column in 0...4 {
    for row in 0...round.count-1 {
        totals[column] = round[row][column] + totals[column]
    }
}
Running it in a Playground gives "Fatal error: Index out of range".
In trying various options, it appears to be the "totals[column] =" assignment that throws the error, but I cannot determine why.
Upvotes: 0
Views: 220
Reputation: 2355
The problem is totals[column]: since totals is an empty array, it has no element at the index column.
It's similar to the following attempt:
var arr = [Int]()
arr[0] = 1 // <- fails, because there is no element at index 0
To fix this issue, you can initialize totals with zeros for the number of elements you want to store in the array:
var totals = Array(repeating: 0, count: round[0].count)
Or even simpler (but less dynamic):
var totals = [0, 0, 0, 0, 0]
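Putting it together, a minimal sketch of the corrected loop (the sample round values are made up for illustration):

```swift
// Sample data: 2 rows, 5 columns (values are assumptions for the demo)
let round = [
    [1, 2, 3, 4, 5],
    [10, 20, 30, 40, 50]
]

// Pre-fill totals with zeros so every index is valid before assignment
var totals = Array(repeating: 0, count: round[0].count)

for row in round {
    for (column, value) in row.enumerated() {
        totals[column] += value
    }
}

print(totals) // [11, 22, 33, 44, 55]
```

Iterating over the rows directly (for row in round) also avoids the off-by-one risk of manual 0...round.count-1 index ranges.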
Upvotes: 1