Eros Cai

Reputation: 55

Is it possible to track a face and render it in RealityKit ARView?

Apple's documentation says you can set userFaceTrackingEnabled to enable simultaneous front- and back-camera tracking. After adding an ARView and setting the configuration correctly, I can confirm that the ARSessionDelegate methods are called as expected:

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // triggered
    }
}

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // triggered
    }
}

So now I have an ARFaceAnchor object; what should I do next? Is it possible to render this ARFaceAnchor using RealityKit, or can it only be rendered with SceneKit? All the examples I've found on the internet are implemented using SceneKit.

Upvotes: 4

Views: 1308

Answers (2)

Andy Jazz

Reputation: 58043

If you want to use RealityKit's rendering technology, you should use its own anchors.

So, for a RealityKit face-tracking experience, you just need:

AnchorEntity(AnchoringComponent.Target.face)

And you don't even need the session(_:didAdd:) and session(_:didUpdate:) instance methods if you're using a Reality Composer scene.
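Putting the pieces together, here's a minimal sketch of a face-tracked RealityKit scene set up entirely in code (the view controller name and the sphere content are just illustrative assumptions):

```swift
import UIKit
import ARKit
import RealityKit

class FaceViewController: UIViewController {

    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // RealityKit anchor that tracks the user's face.
        let faceAnchor = AnchorEntity(AnchoringComponent.Target.face)

        // Attach any entity; here, a small sphere floating in front of the face.
        let sphere = ModelEntity(
            mesh: .generateSphere(radius: 0.05),
            materials: [SimpleMaterial(color: .green, isMetallic: false)])
        sphere.position = [0, 0, 0.15]
        faceAnchor.addChild(sphere)

        arView.scene.addAnchor(faceAnchor)

        // Run a face-tracking configuration explicitly on the view's session.
        let config = ARFaceTrackingConfiguration()
        arView.session.run(config)
    }
}
```

Note that no ARSessionDelegate code is needed here: once the session is running, RealityKit keeps the AnchorEntity synced to the detected face automatically.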

If you prepare a scene in Reality Composer, the .face anchor type is available from the start. Here's what the non-editable, hidden Swift code in a .reality file looks like:

public static func loadFace() throws -> Facial.Face {

    guard let realityFileURL = Foundation.Bundle(for: Facial.Face.self).url(forResource: "Facial", 
                                                                          withExtension: "reality") 
    else {
        throw Facial.LoadRealityFileError.fileNotFound("Facial.reality")
    }

    let realityFileSceneURL = realityFileURL.appendingPathComponent("face", isDirectory: false)
    let anchorEntity = try Facial.Face.loadAnchor(contentsOf: realityFileSceneURL)
    return createFace(from: anchorEntity)
}
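Using the generated loader is then a one-liner. A sketch, assuming the project is named "Facial" with a scene named "Face" as in the snippet above:

```swift
import RealityKit

// Load the Reality Composer scene and add its anchor to the view.
// (Error handling kept minimal for illustration.)
func attachFaceScene(to arView: ARView) {
    do {
        let faceScene = try Facial.loadFace()
        arView.scene.anchors.append(faceScene)
    } catch {
        print("Failed to load Facial.reality: \(error)")
    }
}
```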

If you need more detailed information about anchors, please read this post.

P.S.

But, at the moment, there's one unpleasant limitation: if you're using a scene built in Reality Composer, you can use only one type of anchor at a time (horizontal, vertical, image, face, or object). Hence, if you need to use ARWorldTrackingConfiguration along with ARFaceTrackingConfiguration, don't use Reality Composer scenes. I'm sure this situation will be fixed in the near future.

Upvotes: 2

Yucel Bayram

Reputation: 1663

I believe it cannot be done with RealityKit. Reading the face-tracking documentation, I could not find anything about tracking with RealityKit. But you can use SceneKit and also SpriteKit. Please check this document:

https://developer.apple.com/documentation/arkit/tracking_and_visualizing_faces

This sentence also caught my attention:

This sample uses ARSCNView to display 3D content with SceneKit, but you can also use SpriteKit or build your own renderer using Metal (see ARSKView and Displaying an AR Experience with Metal).
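For completeness, here is a minimal SceneKit sketch of the ARSCNView approach that the quoted sample describes (the controller name is an assumption; the ARSCNFaceGeometry calls are from the ARKit API):

```swift
import UIKit
import ARKit
import SceneKit

class SceneKitFaceViewController: UIViewController, ARSCNViewDelegate {

    let sceneView = ARSCNView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Provide a node whose geometry follows the tracked face.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        let node = SCNNode(geometry: faceGeometry)
        node.geometry?.firstMaterial?.fillMode = .lines   // wireframe face mask
        return node
    }

    // Keep the mesh in sync as the face moves and changes expression.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }
}
```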

Upvotes: 0
