P.M.

Reputation: 73

ARAnchor for SCNNode

I'm trying to get a hold of the anchor after adding an SCNNode to the scene of an ARSCNView. My understanding is that the anchor should be created automatically, but I can't seem to retrieve it.

Below is how I add it. The node is saved in a variable called testNode.

let node = SCNNode()
node.geometry = SCNBox(width: 0.5, height: 0.1, length: 0.3, chamferRadius: 0)
node.geometry?.firstMaterial?.diffuse.contents = UIColor.green

sceneView.scene.rootNode.addChildNode(node)

testNode = node

Here is how I try to retrieve it. It always prints nil.

if let testNode = testNode {
    print(sceneView.anchor(for: testNode))
}

Does it not create the anchor? If it does, is there another method I can use to retrieve it?

Upvotes: 3

Views: 5247

Answers (1)

PongBongoSaurus

Reputation: 7385

If you have a look at the Apple docs, they state that:

To track the positions and orientations of real or virtual objects relative to the camera, create anchor objects and use the add(anchor:) method to add them to your AR session.

As such, I think that since you aren't using plane detection, you would need to create an ARAnchor manually when one is needed:

Whenever you place a virtual object, always add an ARAnchor representing its position and orientation to the ARSession. After moving a virtual object, remove the anchor at the old position and create a new anchor at the new position. Adding an anchor tells ARKit that a position is important, improving world tracking quality in that area and helping virtual objects appear to stay in place relative to real-world surfaces.

You can read more about this in the following thread: What's the difference between using ARAnchor to insert a node and directly insert a node?
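As a rough illustration of that guidance (the names here, such as AnchoredBoxPlacer, are placeholders of my own and not from ARKit or the code below), placing an object means adding an anchor, and moving it means swapping the old anchor for a new one:

import ARKit

//A Minimal Sketch Of The "Remove The Old Anchor, Add A New One" Pattern From The Docs
final class AnchoredBoxPlacer {

    private(set) var boxAnchor: ARAnchor?

    //1. Place The Object: Add An ARAnchor At Its Transform So ARKit Tracks That Position
    func place(at transform: simd_float4x4, in session: ARSession) {
        let anchor = ARAnchor(transform: transform)
        session.add(anchor: anchor)
        boxAnchor = anchor
    }

    //2. Move The Object: Remove The Anchor At The Old Position & Add One At The New Position
    func move(to transform: simd_float4x4, in session: ARSession) {
        if let oldAnchor = boxAnchor { session.remove(anchor: oldAnchor) }
        place(at: transform, in: session)
    }
}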

Anyway, in order to get you started I began by creating an SCNNode called currentNode:

var currentNode: SCNNode?
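For context, the augmentedRealityView and augmentedRealitySession used in the snippets below are assumed to be wired up roughly like this (the storyboard outlet and configuration are illustrative assumptions, not part of the original code):

import UIKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    //1. The ARSCNView (Here Assumed To Come From A Storyboard) & Its Session
    @IBOutlet weak var augmentedRealityView: ARSCNView!
    let augmentedRealitySession = ARSession()

    //2. The Node We Will Create In The Delegate Callback
    var currentNode: SCNNode?

    override func viewDidLoad() {
        super.viewDidLoad()

        //3. Wire The View To The Session & Delegate
        augmentedRealityView.session = augmentedRealitySession
        augmentedRealityView.delegate = self

        //4. Register The Tap Gesture Used Below
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        augmentedRealityView.addGestureRecognizer(tapGesture)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        //5. Run A Standard World Tracking Configuration
        augmentedRealitySession.run(ARWorldTrackingConfiguration())
    }
}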

Then, using a UITapGestureRecognizer, I created an ARAnchor manually at the touch location:

@objc func handleTap(_ gesture: UITapGestureRecognizer) {

    //1. Get The Current Touch Location
    let currentTouchLocation = gesture.location(in: self.augmentedRealityView)

    //2. If We Have Hit A Feature Point Get The Result
    if let hitTest = augmentedRealityView.hitTest(currentTouchLocation, types: [.featurePoint]).last {

        //3. Create An Anchor At The World Transform
        let anchor = ARAnchor(transform: hitTest.worldTransform)

        //4. Add It To The Session
        augmentedRealitySession.add(anchor: anchor)
    }
}

Having added the anchor, I then used the ARSCNViewDelegate renderer(_:didAdd:for:) callback to create the currentNode like so:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

    //1. Only Create The Box Node Once
    if currentNode == nil {

        currentNode = SCNNode()
        let nodeGeometry = SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0)
        nodeGeometry.firstMaterial?.diffuse.contents = UIColor.cyan
        currentNode?.geometry = nodeGeometry

        //2. Attach It To The Anchor's Node (Which ARKit Already Positions At The Anchor's Transform,
        //So No Extra Position Needs To Be Applied To The Child)
        node.addChildNode(currentNode!)
    }
}
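A small variation worth noting (not part of the original approach): instead of adding a child node in didAdd, you could implement the renderer(_:nodeFor:) delegate method and return the box node directly, so it becomes the node associated with the anchor and augmentedRealityView.anchor(for:) can resolve it without a hit test. A sketch:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {

    //1. Only Build The Box For The First Anchor; Give Later Anchors An Empty Node
    guard currentNode == nil else { return SCNNode() }

    //2. Create The Box Node (ARKit Positions The Returned Node At The Anchor's Transform)
    let boxNode = SCNNode()
    let nodeGeometry = SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0)
    nodeGeometry.firstMaterial?.diffuse.contents = UIColor.cyan
    boxNode.geometry = nodeGeometry

    currentNode = boxNode
    return boxNode
}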

In order to test that the didAdd approach works, i.e. being able to log the corresponding ARAnchor, I changed the handleTap method to include this at the end:

if let anchorHitTest = augmentedRealityView.hitTest(currentTouchLocation, options: nil).first {
    print(augmentedRealityView.anchor(for: anchorHitTest.node))
}

Which in my console log prints:

Optional(<ARAnchor: 0x1c0535680 identifier="23CFF447-68E9-451D-A64D-17C972EB5F4B" transform=<translation=(-0.006610 -0.095542 -0.357221) rotation=(-0.00° 0.00° 0.00°)>>)
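Since the hit test above returns currentNode (the only node with geometry), the same anchor can also be looked up from the node directly, which is essentially what the question was attempting:

//Sketch: Retrieve The Anchor Directly From The Node Created In The Delegate Callback
if let currentNode = currentNode {
    print(augmentedRealityView.anchor(for: currentNode))
}

The difference from the question's code is that here an ARAnchor was explicitly added to the session, so there is actually an anchor for anchor(for:) to return.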

Hope it helps...

Upvotes: 7
