martin

Reputation: 103

How to get "light-invariant" `capturedImage` in ARKit?

I want to stitch multiple background images provided by ARKit (ARFrame.capturedImage). (I know there are better ways to do this task, but I am using my custom algorithm.)

The issue is that the live stream does not have locked exposure, so the color of an object in the scene depends on how I orient my iPhone. This, for example, leads to a wall having a very different color in each frame (from white through gray to brown-ish), which creates visible banding when the images are stitched together.

I noticed ARKit provides lightEstimate for each ARFrame with the ambientIntensity and ambientColorTemperature properties. There is also the ARFrame.camera.exposureOffset property.
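For context, this is roughly where I read those values on each frame (a minimal sketch; the `FrameLogger` name is just illustrative):

```swift
import ARKit

final class FrameLogger: NSObject, ARSessionDelegate {

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Ambient light estimate: intensity (in lumens) and color temperature (in Kelvin).
        if let light = frame.lightEstimate {
            print("ambientIntensity:", light.ambientIntensity)
            print("ambientColorTemperature:", light.ambientColorTemperature)
        }
        // Exposure offset, in EV (f-stops), for the frame's captured image.
        print("exposureOffset:", frame.camera.exposureOffset)
    }
}
```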

Can these properties be used to "normalize" captured images so that colors of the objects in the scene stay roughly the same throughout time and I don't end up with severe banding?

P.S. I do need to use ARKit; otherwise I would set up my own session based on the AVFoundation API with my own settings (e.g. locked exposure).

Upvotes: 4

Views: 356

Answers (1)

Andy Jazz

Reputation: 58113

Since none of the mentioned properties is settable, you can't use them directly to fix the intensity of every stitched image in a 360° panorama.

But you can calculate the difference in intensity and exposure between frames and then use those values as multipliers for Core Image filters. For instance, the exposure ratio is as simple as this:

Frame_02_Exposure / Frame_01_Exposure = 0.37

Then use the result as the input multiplier for a CIExposureAdjust filter.
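Since `CIExposureAdjust` takes its strength as `inputEV` (in stops) and `ARFrame.camera.exposureOffset` is already expressed in EV, one possible interpretation is to apply the per-frame difference directly. A rough sketch (untested; the sign convention may need flipping in practice):

```swift
import ARKit
import CoreImage

/// Bring `frame` closer to the exposure of `referenceFrame` by compensating
/// for the difference in their exposure offsets (both values are in EV stops).
func normalizedImage(for frame: ARFrame, relativeTo referenceFrame: ARFrame) -> CIImage {
    let image = CIImage(cvPixelBuffer: frame.capturedImage)

    // Positive means the current frame was exposed brighter than the reference.
    let evDifference = frame.camera.exposureOffset - referenceFrame.camera.exposureOffset

    // Apply the opposite offset so both frames end up at roughly the same brightness.
    let filter = CIFilter(name: "CIExposureAdjust")!
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(-evDifference, forKey: kCIInputEVKey)

    return filter.outputImage ?? image
}
```

If white balance also drifts between frames, `ambientColorTemperature` could in principle be used in a similar way with a `CITemperatureAndTint` filter.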

Upvotes: 1
