Reputation: 99
I'm trying to use RealityKit and SwiftUI to place an object of a specified size in AR. However, the object is displayed a little smaller than the size specified in the code. How can I get the object to display at the correct size?
Here is the current code:
func addObject() {
    // RealityKit units are meters, so this should be a 1 m x 1 m x 1 m box
    let mesh = MeshResource.generateBox(width: Float(1.0),
                                        height: Float(1.0),
                                        depth: Float(1.0))
    let material = SimpleMaterial(color: .gray,
                                  roughness: 0.5,
                                  isMetallic: true)
    let modelEntity = ModelEntity(mesh: mesh,
                                  materials: [material])
    let anchorEntity = AnchorEntity(plane: .horizontal)
    modelEntity.generateCollisionShapes(recursive: true)
    self.installGestures([.rotation, .translation], for: modelEntity)
    anchorEntity.name = "BoxAnchor"
    anchorEntity.addChild(modelEntity)
    self.scene.addAnchor(anchorEntity)
}
I have confirmed that I am able to detect the plane correctly using ARCoachingOverlayView and FocusEntity.
Any advice is helpful. Thanks.
Upvotes: 1
Views: 725
Reputation: 58553
The type methods .generateBox(size: 1)
and .generateBox(width: 1, height: 1, depth: 1)
both create a true one-meter box (RealityKit units are meters). Even if your tracking is good, the cube can still look smaller than 1m x 1m x 1m
because object occlusion is disabled. In other words, the cube "climbs" onto the real-world objects closest to you and appears smaller and closer than it really is: without occlusion, your model is always composited OVER the real-world objects in the camera image.
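As a quick sanity check (just a sketch; modelEntity here is the entity from your addObject() method), you can print the model's visual bounds in world space to confirm the box really measures one meter per side:

    // Expected to print roughly SIMD3<Float>(1.0, 1.0, 1.0) for the 1 m box
    let extents = modelEntity.visualBounds(relativeTo: nil).extents
    print("Box extents in meters:", extents)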
Sometimes (due to bad tracking) your model also "flies away" from the place where it should be located.
In both cases, turning occlusion on is a good idea. To find out how to enable object occlusion based on the Z-channel, read this post.
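For reference, here is a minimal sketch of what enabling occlusion might look like (assuming arView is your ARView; the enableOcclusion name is just for illustration):

    import ARKit
    import RealityKit

    func enableOcclusion(for arView: ARView) {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]

        // Mesh-based object occlusion needs a LiDAR device (iOS 13.4+)
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
            arView.environment.sceneUnderstanding.options.insert(.occlusion)
        }

        // People occlusion works on A12+ devices even without LiDAR
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }

        arView.session.run(config)
    }

On devices without LiDAR you won't get occlusion from the reconstructed scene mesh, but people occlusion still helps the cube read at its true size and distance.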
P.S.
Also follow these recommendations.
Upvotes: 1