xta

Reputation: 809

How to measure the dimensions of a 3d object using ARKit or Apple Vision?

Using the iPhone camera (and presumably some combination of ARKit, Apple Vision, CoreML/mlmodels, etc.), how would you measure the dimensions (width, height, depth) of an object? The object would be something small that sits on a desk.

Using an mlmodel, you can train a model to perform object detection of specific objects, but that only lets you draw a box around the detected object on the 2D screen.

I want to be able to use the phone camera to look at and perhaps move around the object to determine the dimensions/actual size of the object.

I've read about edge detection and shape detection, but I don't think I need image-to-image Holistically-Nested Edge Detection.

ARKit excels at using the phone's hardware to measure small-scale distances accurately enough.

One potential method would be to place a known-size reference object (like a quarter) next to the object for comparison, but that would introduce complications and hassle.

Ideally, I'd like to point the iPhone camera at the small object on the desk, perhaps move around (rotate around the object a bit), and get a ballpark set of measurements of the object's size, plus ARAnchor(s) for the object's actual location.

Upvotes: 1

Views: 3165

Answers (1)

MoD

Reputation: 634

Let the user perform an ARHitTest by tapping on the corners of your object. Each hit gives you a node's position in the AR world, and from those positions you can calculate the specific lengths of the object, etc.

// e.g. the point the user taps, from a UITapGestureRecognizer
let newPoint = gestureRecognizer.location(in: sceneView)
let hitTestResults = sceneView.hitTest(newPoint, types: .estimatedHorizontalPlane)
guard let hitTestResult = hitTestResults.first else { return }
// The position is the translation (fourth column) of the world transform.
print("Your node's position: \(hitTestResult.worldTransform.columns.3)")

Then calculate the distance between those points.
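As a minimal sketch of that calculation (assuming a and b are the worldTransform values from two ARHitTestResults, one per tapped corner): the translation sits in the fourth column of each 4x4 transform, and simd_distance returns the separation in meters.

import simd

// Distance in meters between the translation components of two
// ARHitTestResult world transforms.
func distance(from a: simd_float4x4, to b: simd_float4x4) -> Float {
    let p1 = SIMD3<Float>(a.columns.3.x, a.columns.3.y, a.columns.3.z)
    let p2 = SIMD3<Float>(b.columns.3.x, b.columns.3.y, b.columns.3.z)
    return simd_distance(p1, p2)
}

Measuring between two tapped corners along an edge gives you one dimension; repeating this for the other edges gives width, height, and depth.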

You can also perform this ARHitTest without a tap by the user: detect the object via Vision and trigger the ARHitTest at the right moment.
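Here is a rough sketch of that approach, assuming sceneView is an ARSCNView with a running session. VNDetectRectanglesRequest stands in for whatever Vision/CoreML detector you actually use, and the coordinate conversion is simplified (it ignores the rotation between the captured image and the view).

import ARKit
import Vision

func hitTestDetectedObject(in sceneView: ARSCNView) {
    guard let frame = sceneView.session.currentFrame else { return }

    let request = VNDetectRectanglesRequest { request, _ in
        guard let observation = request.results?.first as? VNRectangleObservation else { return }
        DispatchQueue.main.async {
            // Vision bounding boxes are normalized with the origin at the
            // bottom-left; convert the box center into view coordinates.
            let box = observation.boundingBox
            let center = CGPoint(x: box.midX * sceneView.bounds.width,
                                 y: (1 - box.midY) * sceneView.bounds.height)
            let results = sceneView.hitTest(center, types: .estimatedHorizontalPlane)
            guard let result = results.first else { return }
            // Pin the detected object's location with an ARAnchor.
            sceneView.session.add(anchor: ARAnchor(transform: result.worldTransform))
        }
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    try? handler.perform([request])
}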

Upvotes: 2
