Reputation: 17303
I have a simple function that returns an array of tuples:
func findNodeNeighbors(node: Node) -> [(node: Node, distance: Double)] {
    var neighbors = [(node: Node, distance: Double)]()
    let nodeLinks = linksWith(node)
    for link in nodeLinks {
        neighbors.append((node: link.otherNodeLinkedWith(node), distance: link.length))
    }
    return neighbors
}
But this turns out to produce an error on the first line of the function body: Invalid use of '()' to call a value of non-function type.
If instead I declare the type of neighbors explicitly, everything is fine:
var neighbors: [(node: Node, distance: Double)] = []
How come?
I've read that it is preferred to declare arrays by initialising them and allowing for implicit type inference.
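For a plain element type, both styles compile, which is what makes the tuple case surprising. For instance:

var names = [String]()   // preferred: type inferred from the initialiser call
var ages: [Int] = []     // explicit annotation with an empty array literal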
Upvotes: 3
Views: 76
Reputation: 2400
I am not 100% certain, but I think another way you can quickly get around this issue for now is by declaring the tuple type with a typealias. E.g.:
typealias Test = (node: Node, distance: Double)   // labels kept so the return type matches the question's

func findNodeNeighbors(node: Node) -> [Test] {
    var neighbors = [Test]()
    // etc.
}
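Filling in the elided body with the loop from the question, the whole workaround would read something like this (a sketch, assuming the same Node, linksWith, and otherNodeLinkedWith from the question):

typealias Test = (node: Node, distance: Double)

func findNodeNeighbors(node: Node) -> [Test] {
    var neighbors = [Test]()   // parses fine: no inline tuple labels in the sugar
    for link in linksWith(node) {
        neighbors.append((node: link.otherNodeLinkedWith(node), distance: link.length))
    }
    return neighbors
}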
Upvotes: 2
Reputation: 40965
Pretty certain this is a bug in Swift's parser, specifically to do with the [Type] sugar in combination with named tuples. The longhand generic syntax, which should be identical to [(node: Node, distance: Double)](), works fine:

var neighbors = Array<(node: Node, distance: Double)>()
edit: looks like the dictionary equivalent has the same problem
Works fine:
var d = Dictionary<Int,(x: Int, y: Int)>()
Busted:
var d = [Int:(x: Int, y: Int)]()
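If the diagnosis is right, the same sugared forms should parse once the labels are dropped, narrowing the bug to named tuples specifically; an unverified sketch:

var a = [(Node, Double)]()     // unlabelled tuple element: parses
var d2 = [Int: (Int, Int)]()   // unlabelled tuple value: parses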
Upvotes: 4
Reputation: 5899
You could initialise the array with a default value, so the element type is inferred from the literal:
var neighbors = [(node: node, distance: 0.0)]
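One caveat: the array then starts with a placeholder entry, so if it needs to begin empty you would have to clear it first, e.g.:

var neighbors = [(node: node, distance: 0.0)]   // type inferred from the literal
neighbors.removeAll()                           // discard the placeholder element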
Upvotes: 0