PopKernel

Reputation: 4260

How to debug custom geometry in SceneKit with Swift

I'm trying to learn how to create custom geometry in SceneKit. I've tried to make a triangle, but nothing shows up, and I'm at a loss as to how to debug this. Is there a way to figure out whether the triangle is valid? I just don't know where to start.

For reference, the playground code in question is below. Note that it is written against Swift 4, but the changes between Swift 3 and Swift 4 are so minor that getting it to compile in Swift 3 is trivial.

import UIKit
import SceneKit

let points = [
    SCNVector3Make(0, 0, 0),
    SCNVector3Make(0, 10, 0),
    SCNVector3Make(10, 0, 0),
]
let indices = [
    0,2,1,
]

let vertexSource = SCNGeometrySource(vertices: points)
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geo = SCNGeometry(sources: [vertexSource], elements: [element])

Upvotes: 2

Views: 808

Answers (2)

jlsiewert

Reputation: 3554

When creating custom SCNGeometryElements, the type of the indices needs to be Int16¹. I don't think this is documented anywhere, but when you change the declaration of the indices to

let indices: [Int16] = [
    0, 2, 1
]

the triangle should appear.
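
For reference, a full playground version might look something like this. The SCNView and PlaygroundSupport live-view setup is just one way to display the result so you can see whether the triangle actually renders; it isn't part of the original code.

import UIKit
import SceneKit
import PlaygroundSupport

let points = [
    SCNVector3Make(0, 0, 0),
    SCNVector3Make(0, 10, 0),
    SCNVector3Make(10, 0, 0),
]
// Int16 (or any other 8/16/32-bit integer type) is an index width SceneKit accepts
let indices: [Int16] = [
    0, 2, 1,
]

let vertexSource = SCNGeometrySource(vertices: points)
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geo = SCNGeometry(sources: [vertexSource], elements: [element])

// Attach the geometry to a node and show it in the playground's live view
let scene = SCNScene()
scene.rootNode.addChildNode(SCNNode(geometry: geo))

let sceneView = SCNView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
sceneView.scene = scene
sceneView.autoenablesDefaultLighting = true
sceneView.allowsCameraControl = true
PlaygroundPage.current.liveView = sceneView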


Edit

1: As @mnuages has pointed out, SceneKit only supports indices up to 32 bits wide. So you can use Int8, Int16, or Int32.

Upvotes: 10

mnuages

Reputation: 13462

From the documentation for the Int type:

On 32-bit platforms, Int is the same size as Int32, and on 64-bit platforms, Int is the same size as Int64.

So in a Swift Playground you'll end up with Int64.

That's unfortunate because SceneKit (and Metal) only support UInt8, UInt16 and UInt32 indices. In the debugger console you should be able to see SceneKit warn about the unsupported 64-bit index width.
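
If you want to be explicit about the index width rather than relying on the element type of a Swift array, you could build the element from raw data with bytesPerIndex spelled out. A rough sketch, assuming UInt32 indices:

import SceneKit

let indices: [UInt32] = [0, 2, 1]
let indexData = indices.withUnsafeBufferPointer { Data(buffer: $0) }
// bytesPerIndex makes the 32-bit index width explicit
let element = SCNGeometryElement(
    data: indexData,
    primitiveType: .triangles,
    primitiveCount: indices.count / 3,
    bytesPerIndex: MemoryLayout<UInt32>.size
)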

Upvotes: 6
