Is it possible in ARKit, or another similar library, to detect the distance to an object with any sort of accuracy or precision?
What are the limits or tolerances on the accuracy or precision of that distance value if the size and range fall into two different real-world scene categories:
1. A small object held at close range.
2. A large object at a greater distance.
For example, for 1 above: can I hold a coffee cup or a car key at arm's length and detect how far away it is within some precision? For 2 above: can I stand in front of a tree and detect how far away the tree is?
If the distance away (Z dimension) can be measured with a high degree of precision, does that mean an object's vertical height (Y dimension) can be measured or calculated with some proportional precision?
To test this, what are the core code elements I should use?
Upvotes: 1
Views: 1130
This is certainly possible with ARKit.
As ARKit runs, it finds features in the environment. These are points it can recognise repeatedly, and it uses them to determine the 'pose' (position and orientation) of the camera.
These feature points come with an [x, y, z] position in real-world space (in ARKit's world coordinate system, whose origin is where the session started). If you tap on the screen, ARKit can give you the 3D position of the feature point closest to your tap.
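As a minimal sketch of what those points look like (assuming sceneView is your ARSCNView; rawFeaturePoints is ARKit's actual API for this), you can dump the current frame's point cloud:

// Read the feature points ARKit has detected in the current frame.
// Assumes `import ARKit` and a running ARSession on `sceneView`.
if let pointCloud = self.sceneView.session.currentFrame?.rawFeaturePoints {
    // Each point is a simd_float3 in ARKit's world coordinate space, in metres.
    for point in pointCloud.points {
        print("Feature point at \(point.x), \(point.y), \(point.z)")
    }
}

The number and stability of these points vary heavily with lighting and surface texture, which is what ultimately bounds the precision you can expect.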
To get the position of an object you would have to:
1. Tap on the object on the screen.
2. Hit test the tap location against the detected feature points.
3. Read the world position of the closest matching feature point.
The following functions will be useful for you.
In viewDidLoad, add:
let gestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(self.handleSingleTap))
self.sceneView.addGestureRecognizer(gestureRecognizer)
with:
@objc func handleSingleTap(recogniser: UIGestureRecognizer) {
    // Where the user tapped, in the view's coordinate space.
    let touchPosition = recogniser.location(in: self.sceneView)
    // Hit test the tap against ARKit's detected feature points.
    let featureHitTestResult = self.sceneView.hitTest(touchPosition, types: .featurePoint)
    // Store the nearest hit, e.g. in a property: var hitFeature: ARHitTestResult?
    if !featureHitTestResult.isEmpty {
        hitFeature = featureHitTestResult.first
    }
}
You can then read the world position from hitFeature (via its worldTransform) and use it to determine the distance from the current camera position.
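A minimal sketch of that last step (distanceToHitFeature is a hypothetical helper name; hitFeature and sceneView are assumed to be properties on your view controller, as above):

// A hypothetical helper, not an ARKit API: distance in metres from the
// current camera to the stored hit feature point.
// Assumes `import ARKit` and `import simd`.
func distanceToHitFeature() -> Float? {
    guard let hitFeature = self.hitFeature,
          let frame = self.sceneView.session.currentFrame else { return nil }

    // The translation (position) is the last column of each 4x4 world transform.
    let featurePosition = hitFeature.worldTransform.columns.3
    let cameraPosition = frame.camera.transform.columns.3

    // Straight-line (Euclidean) distance between feature point and camera.
    return simd_distance(
        simd_float3(featurePosition.x, featurePosition.y, featurePosition.z),
        simd_float3(cameraPosition.x, cameraPosition.y, cameraPosition.z)
    )
}

Since ARKit works in metres, the returned value can be compared directly against a real-world measurement (e.g. a tape measure) to gauge the accuracy you're getting.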
Upvotes: 1