Reputation: 1074
I'm developing an AR drawing app, and I've encountered a bug/problem in ARKit 3: world tracking stops working and the 3D objects added to the AR view freeze.
My app switches between the front and back cameras, and in both cases I enable world tracking.
1. I enable the back-camera session like this:
let configuration = ARWorldTrackingConfiguration()
configuration.isLightEstimationEnabled = true
configuration.planeDetection = [.horizontal, .vertical]
if #available(iOS 13.0, *) {
    configuration.frameSemantics = [.personSegmentationWithDepth]
}
sceneView.session.run(configuration)
Everything works perfectly if I just run this configuration on the scene's session.
2. When I switch to the front camera like this:
let configuration = ARFaceTrackingConfiguration()
if #available(iOS 13.0, *) {
    configuration.isWorldTrackingEnabled = true
}
configuration.isLightEstimationEnabled = true
if #available(iOS 13.0, *) {
    configuration.frameSemantics = [.personSegmentation]
}
sceneView.session.run(configuration)
Everything still works perfectly.
3. But when I switch back to the back-camera session (switching back as in point 1):
The 3D objects added to the AR scene freeze in a static view and world tracking stops working. There's no way to get it working again other than closing and re-opening the app.
The funny thing: if I remove the configuration.isWorldTrackingEnabled = true part, the bug does not appear. But I do need configuration.isWorldTrackingEnabled = true to be set.
Here's a video (no bug appears) of the app with configuration.isWorldTrackingEnabled = false:
No bug video - https://www.youtube.com/watch?v=JPAa6zJe_kQ
And here's a video (bug appears) of the app with configuration.isWorldTrackingEnabled = true:
Yes bug video - https://www.youtube.com/watch?v=UF2Z8c4A42I
What have I tried already?
1. Running the ARFaceTrackingConfiguration() with configuration.isWorldTrackingEnabled = true and then re-running it with configuration.isWorldTrackingEnabled = false, to see if it would override anything and fix it. But no luck, still breaking.
2. Re-running the session with sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors, .stopTrackedRaycasts]), but still no luck.
Does anyone have an idea on how to fix it? Has anyone encountered this weird behaviour?
Upvotes: 2
Views: 842
Reputation: 2913
I am of the belief that switching between the front and rear cameras (the ARWorldTrackingConfiguration and ARFaceTrackingConfiguration, in this case) is not possible in the way it is in a traditional camera session. ARKit is going to reset all of your added objects and anchors each time you do this.
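If you do want to experiment around that reset, one rough idea (my own sketch, not from Apple's documentation or sample code) is to cache your own anchors before the configuration switch and re-add them afterwards. There is no guarantee the new session's coordinate space will line up with the old one, so treat this purely as an illustration; the AnchorCache type and the choice to skip plane/face anchors are hypothetical.

import ARKit

final class AnchorCache {
    private var saved: [ARAnchor] = []

    // Snapshot the user-placed anchors before switching configurations.
    // Plane and face anchors are re-detected by ARKit, so they are skipped.
    func store(from session: ARSession) {
        saved = session.currentFrame?.anchors.filter {
            !($0 is ARPlaneAnchor) && !($0 is ARFaceAnchor)
        } ?? []
    }

    // Re-add the anchors after switching back. They are re-created at their
    // old transforms; the new session's world origin may differ, so the
    // restored positions are not guaranteed to line up.
    func restore(into session: ARSession) {
        for anchor in saved {
            session.add(anchor: ARAnchor(name: anchor.name ?? "restored", transform: anchor.transform))
        }
    }
}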
I would suggest looking into Apple's "Combining User Face-Tracking and World Tracking" sample project, which shows in detail how to use face tracking alongside the rear camera. It assumes you only need to track a face, not actually show the front-facing camera feed, which may not suit your use case.
You could, hypothetically, consider using your own Metal renderer for the ARKit session, which could then take advantage of AVFoundation to provide manual control of the camera (and camera switching), but you would be responsible for determining 3D placement of objects and rendering those on-screen/in a 3D space.
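If you went that route, the skeleton might look something like the sketch below: you own the ARSession directly (no ARSCNView) and receive frames through ARSessionDelegate. This only covers the session side; the Metal drawing of frame.capturedImage and of your 3D content, and any AVFoundation camera handling, is omitted, and the CustomRenderedSession name is made up for illustration.

import ARKit

final class CustomRenderedSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        session.run(configuration)
    }

    // Called every frame; this is where a custom Metal renderer would draw.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.capturedImage is the camera pixel buffer you would upload to
        // a Metal texture; frame.camera supplies the matrices for drawing
        // your own 3D objects on top of it.
        let pixelBuffer = frame.capturedImage
        let viewMatrix = frame.camera.transform.inverse
        _ = (pixelBuffer, viewMatrix)
    }
}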
Additionally, it may be worth trying to set userFaceTrackingEnabled to true when configuring your ARWorldTrackingConfiguration.
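A minimal sketch of that, based on the question's own setup (the supportsUserFaceTracking check, the run options, and the helper function name are my assumptions about how you'd guard it):

import ARKit

// Hypothetical helper; sceneView is the same ARSCNView as in the question.
func runCombinedWorldAndFaceTracking(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isLightEstimationEnabled = true
    configuration.planeDetection = [.horizontal, .vertical]
    if #available(iOS 13.0, *), ARWorldTrackingConfiguration.supportsUserFaceTracking {
        // The rear camera drives world tracking while the front (TrueDepth)
        // camera simultaneously delivers ARFaceAnchor updates.
        configuration.userFaceTrackingEnabled = true
    }
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}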
In short, you're not really switching between cameras as you think; you are switching between configurations that happen to use different cameras. iOS/iPadOS 13 added the ability to add face tracking to a world tracking configuration and world tracking to a face tracking configuration, but that is meant for running both cameras at the same time while only showing the camera preview of one of them to the user during the session. (Incidentally, people occlusion does not actually do anything in an ARFaceTrackingConfiguration anyway; it only works with the rear camera, so you would need to use AVDepthData to achieve a similar effect.)
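On the people-occlusion point, you can at least query support before assigning frame semantics rather than relying on the availability check alone; what the queries return depends on the device and the configuration, and the helper function name below is made up:

import ARKit

// Enable person segmentation only if the given configuration actually
// supports it on this device; otherwise leave frameSemantics untouched.
func enablePersonSegmentationIfSupported(on configuration: ARConfiguration) {
    if #available(iOS 13.0, *) {
        if type(of: configuration).supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics = [.personSegmentationWithDepth]
        } else if type(of: configuration).supportsFrameSemantics(.personSegmentation) {
            configuration.frameSemantics = [.personSegmentation]
        }
    }
}

Called with the question's ARWorldTrackingConfiguration or ARFaceTrackingConfiguration before session.run, this avoids requesting semantics the current device or configuration cannot deliver.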
Upvotes: 1