Duncan Groenewald

Reputation: 8988

What is the correct way to get a RAW image's size using Core Image on macOS?

According to the Core Image documentation, the following API should return a RAW image's native output size:

let filter = CIFilter(imageURL: url, options: nil)
let value = filter.value(forKey: CIRAWFilterOption.outputNativeSize.rawValue ) as? CIVector

However, this always seems to return the camera's native sensor resolution rather than the actual image size contained in the RAW file.

For example, if I use this API on a RAW file that was shot at a 16:9 aspect ratio on a Sony 19, the image size should be 6000 x 3376, but this API call returns 6000 x 4000.

Is this a bug, or am I missing something? Is there another API call to get the actual image size?

Note that the EXIF data does contain the correct image size.

Upvotes: 2

Views: 197

Answers (1)

Duncan Groenewald

Reputation: 8988

OK, it seems that the camera stores a thumbnail image generated using the camera's aspect ratio setting, but the RAW file still contains the full image, i.e. the camera's native sensor size.

When using the macOS Quick Look thumbnail generator you get back the camera-generated thumbnail, which is cropped to whatever aspect ratio was set at the time.

However, when using CIFilter to load the RAW file as a CIImage, you get the full uncropped image. Querying the image size through CIFilter likewise returns the full sensor size.
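As a minimal sketch of the above (the file path is hypothetical), the extent of the RAW filter's output image gives the full, uncropped sensor size:

```swift
import Foundation
import CoreImage

// Hypothetical RAW file on disk.
let url = URL(fileURLWithPath: "/path/to/photo.ARW")

// Loading the RAW through CIFilter decodes the full sensor image,
// so the output image's extent reflects the uncropped size
// (e.g. 6000 x 4000 in the example above), not the cropped size.
let filter = CIFilter(imageURL: url, options: nil)
if let output = filter.outputImage {
    print("Full RAW size: \(output.extent.size)")
}
```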

It seems the cropped image size is defined in the EXIF XDimension and YDimension properties.
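To read those EXIF dimensions without decoding the RAW, one option is ImageIO; a sketch (the file path is hypothetical, and `kCGImagePropertyExifPixelXDimension`/`...YDimension` are the ImageIO keys corresponding to the EXIF XDimension/YDimension tags):

```swift
import Foundation
import ImageIO

// Hypothetical RAW file on disk.
let url = URL(fileURLWithPath: "/path/to/photo.ARW")

// Read the image metadata without decoding the full RAW, then pull the
// cropped pixel dimensions out of the EXIF dictionary.
if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] {
    let width = exif[kCGImagePropertyExifPixelXDimension]
    let height = exif[kCGImagePropertyExifPixelYDimension]
    print("Cropped size: \(width ?? "?") x \(height ?? "?")")
}
```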

Upvotes: 1
