Reputation: 1429
Apple has released ARKit 1.5 (beta) with some additional features (vertical plane detection, image detection, ...).
I am working on image detection, and I would like to know how we can get orientation information when an image is detected (vertical / horizontal image detection).
Is ARPlaneAnchor.Alignment the only way to get this information?
In Apple's sample project, they assume the image is horizontal in its local space:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    let referenceImage = imageAnchor.referenceImage
    updateQueue.async {
        // Create a plane to visualize the initial position of the detected image.
        let plane = SCNPlane(width: referenceImage.physicalSize.width,
                             height: referenceImage.physicalSize.height)
        let planeNode = SCNNode(geometry: plane)
        planeNode.opacity = 0.25

        /*
         `SCNPlane` is vertically oriented in its local coordinate space, but
         `ARImageAnchor` assumes the image is horizontal in its local space, so
         rotate the plane to match.
         */
        planeNode.eulerAngles.x = -.pi / 2

        /*
         Image anchors are not tracked after initial detection, so create an
         animation that limits the duration for which the plane visualization appears.
         */
        planeNode.runAction(self.imageHighlightAction)

        // Add the plane visualization to the scene.
        node.addChildNode(planeNode)
    }
}
Upvotes: 0
Views: 1378
Reputation: 157
In ARKit apps that use both vertical and horizontal plane detection, we can get a plane's orientation from its alignment property, whose type is ARPlaneAnchor.Alignment.
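For example, here is a minimal sketch (my own illustration, not code from the question) of a view controller that enables both plane orientations and logs each detected plane's alignment:

import ARKit
import SceneKit
import UIKit

class PlaneAlignmentViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!   // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // Detecting both plane orientations requires iOS 11.3 / ARKit 1.5.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // ARPlaneAnchor.Alignment reports how the detected surface is oriented.
        switch planeAnchor.alignment {
        case .horizontal:
            print("Detected a horizontal plane")
        case .vertical:
            print("Detected a vertical plane")
        default:
            break
        }
    }
}

Note that alignment lives on ARPlaneAnchor, so this approach relies on plane detection rather than on the image anchor itself.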
Upvotes: 0
Reputation: 7385
I am not entirely sure if this is correct, but hopefully this will help you or at least get you started.
I think you would need to look at getting data from the camera to help with this. For example:
func renderer(_ renderer: SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {
    /*
     Get The Pitch (Rotation Around X Axis),
     The Yaw (Rotation Around Y Axis),
     And The Roll (Rotation Around Z Axis) Of The Camera
     */
    guard let camera = self.augmentedRealityView.session.currentFrame?.camera else { return }

    let pitch = camera.eulerAngles.x
    let yaw = camera.eulerAngles.y
    let roll = camera.eulerAngles.z

    print("""
    Pitch = \(degreesFrom(pitch))
    Yaw = \(degreesFrom(yaw))
    Roll = \(degreesFrom(roll))
    """)
}

/// Convert Radians To Degrees
///
/// - Parameter radian: Float
/// - Returns: Float
func degreesFrom(_ radian: Float) -> Float {
    return radian * Float(180.0 / Double.pi)
}
Then you can determine (approximately) whether the device is in a horizontal or vertical position.
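For instance, here is a rough sketch of such a check (the thresholds and the helper name are my own, not part of the original answer):

/// Roughly classify the device's orientation from the camera pitch in degrees.
/// Pitch near 0° means the camera is pointing at the horizon (device held upright);
/// pitch near ±90° means it is pointing down or up (device held flat).
func approximateOrientation(fromPitchDegrees pitch: Float) -> String {
    if abs(pitch) > 60 {
        return "roughly horizontal"   // e.g. facing the floor or ceiling
    } else if abs(pitch) < 30 {
        return "roughly vertical"     // e.g. facing a wall
    } else {
        return "in between"
    }
}

You could call this from the renderer above, e.g. print(approximateOrientation(fromPitchDegrees: degreesFrom(pitch))).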
Upvotes: 2