Mohammad Albardaweel

Reputation: 67

Augmented Reality for iOS and Swift front-facing camera applications

I am developing an app that needs to use the front-facing camera of the iPhone for an Augmented Reality experience using Swift. I have tried to use ARKit, but ARKit's front-facing camera support is only available on iPhone X.

So, which frameworks or libraries can I use with Swift to develop apps that have an AR experience, especially with the front-facing camera, other than ARKit?

Upvotes: 3

Views: 2999

Answers (2)

Andy Jazz

Reputation: 58563

ARKit 6.0

The TrueDepth front-facing camera of the iPhone (models from X to 15) gives you a depth channel at a frame rate of 60 fps, and the front-facing image camera gives you RGB channels at 60 fps as well. However, if you do not need face tracking in your app, you can use the AVFoundation or Vision frameworks instead. Nonetheless, using an iPhone with a TrueDepth camera makes developing face-tracking apps much easier.
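If you go the AVFoundation route, a minimal sketch might look like the following (an assumption of typical setup, not from the original answer; the delegate that actually receives depth frames is omitted):

import AVFoundation

// Sketch: reading depth from the TrueDepth camera without ARKit.
let captureSession = AVCaptureSession()

if let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                        for: .video,
                                        position: .front),
   let input = try? AVCaptureDeviceInput(device: device),
   captureSession.canAddInput(input) {

    captureSession.addInput(input)

    let depthOutput = AVCaptureDepthDataOutput()
    if captureSession.canAddOutput(depthOutput) {
        captureSession.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = true  // interpolate holes in the depth map
    }
    captureSession.startRunning()
}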

The TrueDepth sensor's infrared emitter projects a pattern of 30,000 dots onto the user's face. Those dots are then captured by a dedicated IR camera for analysis. Additionally, an ambient light sensor helps the system set an appropriate light level.

If your device doesn't have a TrueDepth sensor (the iPhone SE, iPhone 6s, iPhone 7 and iPhone 8 don't), you cannot use features such as Animoji, Face ID, or depth occlusion.

In ARKit for iOS, face tracking is run by an object of the ARFaceTrackingConfiguration class, a configuration that tracks the movement and expressions of the user's face with the TrueDepth camera. ARKit also allows you to simultaneously track the surrounding environment with the back camera and your face with the front camera, and you can track up to 3 faces at a time.
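As a side note, tracking more than one face is opt-in. A quick sketch (here `session` is assumed to be your ARSession):

import ARKit

// Sketch: ask for as many simultaneously tracked faces as the device supports.
let faceConfig = ARFaceTrackingConfiguration()
faceConfig.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
session.run(faceConfig)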

Here are two scenarios for setting up your AR configuration.


// Scenario 1: rear-camera world tracking with front-camera face tracking enabled.
let configuration = ARWorldTrackingConfiguration()

// supportsUserFaceTracking is a class property, so check it on the type.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
session.run(configuration)

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // your code here...
    }
}

// Scenario 2: front-camera face tracking with world tracking enabled.
let configuration = ARFaceTrackingConfiguration()

if ARFaceTrackingConfiguration.supportsWorldTracking {
    configuration.isWorldTrackingEnabled = true  // note: the instance property is isWorldTrackingEnabled
}
session.run(configuration)

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let transform = frame.camera.transform
    // your code here...
}

Upvotes: 4

rickster

Reputation: 126167

ARKit isn't the only possible way to create "AR" experiences on iOS, nor is it the only way that Apple permits creating "AR" apps in the App Store.

If you define "front-facing-camera AR" as something like "uses front camera, detects faces, allows placing virtual 2D/3D content overlays that appear to stay attached to the face", there are any number of technologies one could use. Apps like Snapchat have been doing this kind of "AR" since before ARKit existed, using technology they've either developed in-house or licensed from third parties. How you do it and how well it works depends on the technology you use. ARKit guarantees a certain precision of results by requiring a front-facing depth camera.

It's entirely possible to develop an app that uses ARKit for face tracking on TrueDepth devices and a different technology for other devices. For example, looking only at what you can do "out of the box" with Apple's SDK, there's the Vision framework, which locates and tracks faces in 2D. There are probably a few third-party libraries out there, too... or you could go looking through academic journals, since face detection/tracking is a pretty active area of computer vision research.
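A rough sketch of that hybrid approach (names like `arSession` and `pixelBuffer` are placeholders for your own AR session and camera frames, not part of any API):

import ARKit
import Vision

// Sketch: full ARKit face tracking on TrueDepth devices, Vision elsewhere.
if ARFaceTrackingConfiguration.isSupported {
    // TrueDepth hardware available: 3D face anchors via ARKit.
    arSession.run(ARFaceTrackingConfiguration())
} else {
    // Fallback: 2D face observations from Vision on raw camera frames.
    let request = VNDetectFaceLandmarksRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        // Position 2D overlays using faces[i].boundingBox and landmarks.
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}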

Upvotes: 4
