Reputation: 89
In RealityKit, similar to ARKit, objects won't show until the camera has detected some sort of flat surface. Once the camera has detected that surface, the objects show up and are pinned to it.
How do I know (in code) that the camera has detected a flat surface? Actually, I want to highlight the selectable area, but I'm not sure whether RealityKit allows you to do so; I know SceneKit does, though.
Upvotes: 6
Views: 3603
Reputation: 58053
There's a plane initialiser (and enumeration case) in RealityKit for this purpose:
convenience init(plane alignment: AnchoringComponent.Target.Alignment,
                 classification: AnchoringComponent.Target.Classification,
                 minimumBounds: SIMD2<Float>)

/* Where `minimumBounds` is the minimum size of the target plane */
It's a counterpart of ARKit's ARPlaneAnchor with its extent property (an estimated width and length of the detected plane), but in RealityKit it works a little differently.
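For comparison, here's a minimal sketch of how you'd catch that moment on the ARKit side; the delegate class name is my own, while the ARSessionDelegate callback and ARPlaneAnchor's extent property are standard ARKit API:

import ARKit

class PlaneDetectionDelegate: NSObject, ARSessionDelegate {

    // ARKit calls this when new anchors appear, including detected planes
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let planeAnchor as ARPlaneAnchor in anchors {
            // extent.x and extent.z are the plane's estimated width and length in metres
            print("Detected a \(planeAnchor.alignment) plane:",
                  planeAnchor.extent.x, "x", planeAnchor.extent.z)
        }
    }
}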
In real code you might use it this way:
let anchor = AnchorEntity(.plane([.horizontal, .vertical],
                  classification: [.wall, .table, .floor],
                    minimumBounds: [0.375, 0.375]))

/* Here we create an anchor for detected planes measuring at least 37.5 × 37.5 cm */

anchor.addChild(semiTransparentPlaneEntity)    // visualising a detected plane

arView.scene.anchors.append(anchor)
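The semiTransparentPlaneEntity used above isn't defined in the answer; a sketch of such an entity (the size, colour and alpha values are assumptions, and how translucency is rendered can vary between RealityKit versions) might look like this:

import RealityKit
import UIKit

// A flat plane mesh with a translucent material, used purely to visualise the detected plane
let planeMesh = MeshResource.generatePlane(width: 0.375, depth: 0.375)
let translucent = SimpleMaterial(color: UIColor.cyan.withAlphaComponent(0.5),
                                 isMetallic: false)
let semiTransparentPlaneEntity = ModelEntity(mesh: planeMesh,
                                             materials: [translucent])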
Note that both the alignment and classification arguments conform to the OptionSet protocol, so you can pass a single option or several at once, as shown below.
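Here are two hypothetical variants of the anchor above, one built from single options and one from option sets:

let floorAnchor = AnchorEntity(.plane(.horizontal,
                       classification: .floor,
                         minimumBounds: [0.2, 0.2]))

let anySurfaceAnchor = AnchorEntity(.plane([.horizontal, .vertical],
                            classification: .any,
                              minimumBounds: [0.2, 0.2]))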
And you can always find out whether the plane anchor has been created or not:
let arView = ARView(frame: .zero)

let anchor = AnchorEntity(.plane(.any, classification: .any,
                                minimumBounds: [0.5, 0.5]))
anchor.name = "PlaneAnchor"
arView.scene.anchors.append(anchor)    // without this line the check below always returns false

// Check whether the scene contains an anchor with that name
let containsOrNot = arView.scene.anchors.contains(where: {
    $0.name == "PlaneAnchor"
})

print(containsOrNot)
print(arView.scene.anchors.count)
print(arView.scene.anchors.first?.anchor?.id)
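If you need to know the exact moment when the anchor attaches to a detected plane, you can also subscribe to the scene's AnchoredStateChanged event. This sketch assumes the arView and anchor from the snippet above and stores the Combine Cancellable in a property:

import Combine

var subscriptions: [Cancellable] = []

subscriptions.append(
    arView.scene.subscribe(to: SceneEvents.AnchoredStateChanged.self) { event in
        // isAnchored becomes true when the anchoring entity attaches to a detected plane
        if event.isAnchored {
            print("A plane was detected, the anchor is now pinned to it")
        }
    }
)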
Upvotes: 6