Reputation: 63
I am using RealityKit + SwiftUI + ARSessionDelegate to render 3D content on top of an ARReferenceObject. I want to remove the 3D content once the camera pans away from the object and it is no longer in the frame.
Currently I render the 3D content when the object is detected, which is what I want. But I have multiple identical objects that I want to identify separately using the same ARReferenceObject. So in order to do this I need to remove the original anchoring.
This is my wrapper for SwiftUI:
struct ARViewWrapper: UIViewRepresentable {
    @ObservedObject var arManager: ARManager

    // create an alias for our wrapper
    typealias UIViewType = ARView

    // delegate for the view representable
    func makeCoordinator() -> Coordinator {
        return Coordinator(arManager: self.arManager)
    }

    func makeUIView(context: Context) -> ARView {
        // create the ARView
        let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
        // assign the delegate
        arView.session.delegate = context.coordinator
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        print("Updating View")
        // create an anchor from the reference object and add it to the ARView
        let target = AnchorEntity(.object(group: "AR Resources", name: "bj"))
        target.name = "obj_anchor"
        // add the anchor to the AR world
        if uiView.scene.anchors.isEmpty {
            uiView.scene.anchors.append(target)
        } else {
            uiView.scene.anchors[0] = target
        }
        // add plane and title to the anchor
        addARObjs(anchor: target, arObj: arManager.currARObj)
    }
}
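For context, the wrapper is embedded from SwiftUI roughly like the sketch below (ContentView is just a placeholder name for whatever view owns the ARManager):

// placeholder parent view that owns the ARManager and hosts the wrapper
struct ContentView: View {
    @ObservedObject var arManager: ARManager

    var body: some View {
        ARViewWrapper(arManager: arManager)
            .edgesIgnoringSafeArea(.all)
    }
}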
This is my Delegate:
class Coordinator: NSObject, ARSessionDelegate {
    @ObservedObject var arManager: ARManager

    init(arManager: ARManager) {
        self.arManager = arManager
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        return
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        return
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        return
    }
}
Upvotes: 3
Views: 1305
Reputation: 58093
You can do it in SceneKit. All you need is the isNode(_:insideFrustumOf:) instance method, which returns a Boolean value indicating whether a node might be visible from a specified point of view. This method is also available in ARKit (as part of its SceneKit integration).
func isNode(_ node: SCNNode, insideFrustumOf pointOfView: SCNNode) -> Bool
Sample code:
var allYourNodes = [SCNNode]()
allYourNodes.append(node001)
allYourNodes.append(node002)

guard let pointOfView = arSCNView.pointOfView else { return }

for yourNode in allYourNodes {
    if !arSCNView.isNode(yourNode, insideFrustumOf: pointOfView) {
        arSCNView.session.remove(anchor: yourARAnchor)
    }
}
However, I haven't found a similar method in RealityKit 2.0. Hope it'll be added by Cupertino engineers in the near future.
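In the meantime, a possible workaround (just a sketch, not a real frustum test) is to project the entity's world position into screen space with ARView's project(_:) method and remove the anchor when the point falls outside the view's bounds:

// rough "is it still on screen?" check for a RealityKit anchor
// assumes `arView` is your ARView and `objAnchor` is the AnchorEntity you added
func removeIfOffScreen(_ objAnchor: AnchorEntity, in arView: ARView) {
    let worldPosition = objAnchor.position(relativeTo: nil)
    // project(_:) may return nil, e.g. when the point is behind the camera
    if let screenPoint = arView.project(worldPosition),
       arView.bounds.contains(screenPoint) {
        return    // still visible, keep it
    }
    arView.scene.removeAnchor(objAnchor)
}

Note that this only tests a single point (the anchor's origin), so a partially visible object may be removed earlier than a true frustum test would allow.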
Here's what we have in RealityKit 2.0 at the moment:
Apple's documentation says: During an AR session, RealityKit automatically uses the device's camera to define the perspective from which to render the scene. When rendering a scene outside of an AR session – with the view's cameraMode property set to ARView.CameraMode.nonAR – RealityKit uses a PerspectiveCamera instead. You can add a perspective camera anywhere in your scene to control the point of view. If you don't explicitly provide one, RealityKit creates a default camera for you.
So, the only available parameters of a PerspectiveCameraComponent at the moment are:
init(near: Float, far: Float, fieldOfViewInDegrees: Float)
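For completeness, here's a minimal sketch of adding such a camera to a non-AR RealityKit view (the anchor position and field of view below are arbitrary placeholder values):

// explicit PerspectiveCamera in a non-AR RealityKit view
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)

let camera = PerspectiveCamera()
camera.camera.fieldOfViewInDegrees = 60
camera.camera.near = 0.01
camera.camera.far = 100

// place the camera half a metre back from the world origin
let cameraAnchor = AnchorEntity(world: [0, 0, 0.5])
cameraAnchor.addChild(camera)
arView.scene.addAnchor(cameraAnchor)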
Upvotes: 2