Peter Pohlmann

Reputation: 1508

SceneKit LiDAR features for object occlusion

Object occlusion with LiDAR works in RealityKit with the scene understanding option. Basically the scanned geometry is used for the occlusion.

arView.environment.sceneUnderstanding.options.insert(.occlusion)
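For context, a minimal RealityKit setup using that option might look like the sketch below. This assumes the default `automaticallyConfigureSession` behavior, where ARView enables scene reconstruction itself on supported devices; the device-support check is a precaution, not something the original snippet includes.

```swift
import ARKit
import RealityKit

let arView = ARView(frame: .zero)

// Only insert the occlusion option on LiDAR devices that
// support mesh scene reconstruction.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```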

Sadly, this is not included in SceneKit. While it is still possible to get the scanned geometry from LiDAR via ARMeshAnchor, the object occlusion has to be done by hand from this geometry. Here is a discussion on it: Apple Forum

Is there already a solution for this approach?

Upvotes: 5

Views: 1335

Answers (1)

Haris

Reputation: 1862

I achieved object occlusion in SceneKit by doing something like this:

// Before starting the session, enable scene reconstruction in the configuration
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
} else {
    // Handle devices that don't support scene reconstruction
}
// Run the view's session
sceneView.session.run(configuration)

You will get the scene reconstruction mesh in this delegate method, so implement it like this:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let meshAnchor = anchor as? ARMeshAnchor else {
        return nil
    }

    let geometry = createGeometryFromAnchor(meshAnchor: meshAnchor)

    // Apply occlusion material: write depth only, no color
    geometry.firstMaterial?.colorBufferWriteMask = []
    geometry.firstMaterial?.writesToDepthBuffer = true
    geometry.firstMaterial?.readsFromDepthBuffer = true

    let node = SCNNode(geometry: geometry)
    // Change the rendering order so it renders before our virtual objects
    node.renderingOrder = -1

    return node
}

Update your node when the mesh updates:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let meshAnchor = anchor as? ARMeshAnchor else {
        return
    }
    let geometry = createGeometryFromAnchor(meshAnchor: meshAnchor)

    // Re-apply the occlusion material to the updated geometry
    geometry.firstMaterial?.colorBufferWriteMask = []
    geometry.firstMaterial?.writesToDepthBuffer = true
    geometry.firstMaterial?.readsFromDepthBuffer = true

    node.geometry = geometry
}

Rest of the methods:

// Taken from https://developer.apple.com/forums/thread/130599
func createGeometryFromAnchor(meshAnchor: ARMeshAnchor) -> SCNGeometry {
    let meshGeometry = meshAnchor.geometry
    let vertices = meshGeometry.vertices
    let normals = meshGeometry.normals
    let faces = meshGeometry.faces

    // Use the MTLBuffers that ARKit gives us directly, without copying
    let vertexSource = SCNGeometrySource(buffer: vertices.buffer,
                                         vertexFormat: vertices.format,
                                         semantic: .vertex,
                                         vertexCount: vertices.count,
                                         dataOffset: vertices.offset,
                                         dataStride: vertices.stride)

    let normalsSource = SCNGeometrySource(buffer: normals.buffer,
                                          vertexFormat: normals.format,
                                          semantic: .normal,
                                          vertexCount: normals.count,
                                          dataOffset: normals.offset,
                                          dataStride: normals.stride)

    // Copy the face index bytes, as we may use them later
    let faceData = Data(bytes: faces.buffer.contents(), count: faces.buffer.length)

    // Create the geometry element from the face indices
    let geometryElement = SCNGeometryElement(data: faceData,
                                             primitiveType: primitiveType(type: faces.primitiveType),
                                             primitiveCount: faces.count,
                                             bytesPerIndex: faces.bytesPerIndex)

    return SCNGeometry(sources: [vertexSource, normalsSource], elements: [geometryElement])
}

func primitiveType(type: ARGeometryPrimitiveType) -> SCNGeometryPrimitiveType {
    switch type {
    case .line: return .line
    case .triangle: return .triangles
    default: return .triangles
    }
}

Upvotes: 4
