nico_dkd

Reputation: 169

How to display a face in world tracking

I'm playing around with the new ARKit 3, especially with simultaneous world and face tracking.

I couldn't find a good tutorial or example.

I don't get how to start.

My goal is to display a face in the world-tracking scene that mirrors my facial expressions (captured by the front camera).

Really hoping someone can help me.

// That's my setup for the configuration
private func setupFaceTracking() {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.isLightEstimationEnabled = true
    configuration.userFaceTrackingEnabled = true
    arView.session.run(configuration, options: [])
}

As a reference, this is what I'm trying to do

Upvotes: 0

Views: 1167

Answers (1)

beyowulf

Reputation: 15331

You're correct so far. You need to set up an ARWorldTrackingConfiguration with userFaceTrackingEnabled, and you'll likely also want some kind of plane detection so you can anchor the face node into the scene. If you're using the ARKit Xcode template, you can write something like:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a session configuration
    let configuration = ARWorldTrackingConfiguration()
    configuration.userFaceTrackingEnabled = true
    configuration.isLightEstimationEnabled = true
    configuration.planeDetection = [.horizontal]

    // Run the view's session
    sceneView.session.run(configuration)
}
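Simultaneous world and face tracking also needs iOS 13 and a device with a TrueDepth front camera, so it can be worth guarding the configuration. A minimal sketch, assuming you want to fall back to plain world tracking on unsupported devices:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let configuration = ARWorldTrackingConfiguration()

    // Only turn on user face tracking when the device actually supports it
    // (requires a TrueDepth front camera and iOS 13 or later)
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        configuration.userFaceTrackingEnabled = true
    }

    configuration.isLightEstimationEnabled = true
    configuration.planeDetection = [.horizontal]

    // Run the view's session
    sceneView.session.run(configuration)
}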

To get the face mesh you should use ARSCNFaceGeometry, which can be instantiated with the Metal device of your ARSCNView and stored as a property on your view controller. For example:

lazy var faceGeometry: ARSCNFaceGeometry = {
    let device = sceneView.device!
    let maskGeometry = ARSCNFaceGeometry(device: device)!
    maskGeometry.firstMaterial?.diffuse.contents = UIColor.white
    return maskGeometry
}()
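As an aside, if you'd rather have the eye and mouth openings closed off, ARSCNFaceGeometry also has an initializer that takes a fillMesh flag. The same property would then look something like:

lazy var faceGeometry: ARSCNFaceGeometry = {
    let device = sceneView.device!
    // fillMesh: true closes off the eye and mouth openings in the mask
    let maskGeometry = ARSCNFaceGeometry(device: device, fillMesh: true)!
    maskGeometry.firstMaterial?.diffuse.contents = UIColor.white
    return maskGeometry
}()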

Now it's a matter of getting the face geometry into the scene and responding to changes in the face.

To get the geometry into the scene I would recommend using a tap gesture that places the node on the tapped plane. For example:

lazy var tapGesture: UITapGestureRecognizer = {
    let gesture = UITapGestureRecognizer(target: self, action: #selector(didTap(_:)))
    return gesture
}()

Add it to your scene view in viewDidLoad with sceneView.addGestureRecognizer(tapGesture). Then:

@objc func didTap(_ recognizer: UITapGestureRecognizer) {
    let tapLocation = recognizer.location(in: sceneView)
    let hitTestResults = sceneView.hitTest(tapLocation, types: .existingPlaneUsingExtent)

    guard let hitTestResult = hitTestResults.first, hitTestResult.anchor is ARPlaneAnchor else { return }
    // create anchor and add to session and wait for callback
    let newAnchor = ARAnchor(transform: hitTestResult.worldTransform)
    sceneView.session.add(anchor: newAnchor)
}
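For completeness, the viewDidLoad wiring this assumes could look something like the following (if you started from the ARKit Xcode template, the delegate assignment is already there):

override func viewDidLoad() {
    super.viewDidLoad()

    // The view controller vends and updates nodes for anchors
    sceneView.delegate = self

    // Attach the tap gesture defined above
    sceneView.addGestureRecognizer(tapGesture)
}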

The tap handler adds an ARAnchor at the tapped position. Once the anchor is added, renderer(_:nodeFor:) will be called and we can vend a node that contains our face geometry.

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {

    // Make sure it's not an `ARPlaneAnchor`
    guard !(anchor is ARPlaneAnchor) else { return SCNNode() }

    // create empty node
    let node = SCNNode()

    // Add the stored face geometry as the node's geometry
    node.geometry = faceGeometry

    // Move node up to just above plane
    node.position = SCNVector3(0.0, 0.15, 0.0)

    // Create light so full topology is visible
    // You could also just set `sceneView.autoenablesDefaultLighting = true` to not have to deal with lighting
    let omni = SCNLight()
    omni.type = .omni
    omni.intensity = 3000
    omni.color = UIColor.white
    let omniNode = SCNNode()
    omniNode.light = omni
    omniNode.position = SCNVector3(0, 1, 0.5)

    // Create node to contain face and light
    let parentNode = SCNNode()
    parentNode.addChildNode(node)
    parentNode.addChildNode(omniNode)

    // Return parent node
    return parentNode
}

Now that we can get the face mask into the scene, it's only a matter of responding to updates. For that we use renderer(_:didUpdate:for:).

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // If `ARFaceAnchor` update geometry
    if let faceAnchor = anchor as? ARFaceAnchor {
        faceGeometry.update(from: faceAnchor.geometry)
    } 
    // If `ARPlaneAnchor` update plane geometry and color plane
    else if let anchor = anchor as? ARPlaneAnchor,
        let device = sceneView.device {
        let plane = ARSCNPlaneGeometry(device: device)
        plane?.update(from: anchor.geometry)
        node.geometry = plane
        // For debug, add a color to planes
        node.geometry?.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.8)
    }
}

If you do all of that you should get something like:

[Screenshot: the face mesh rendered above a detected plane]

You'll want to make sure the device has a clear view of your face. If the mask stops responding, or the node doesn't appear when you tap, try holding the device in a different position, typically either closer to or further from your face.

Upvotes: 2
