Honey

Reputation: 189

Why prefer a non-sRGB format for the Vulkan swapchain?

In the Vulkan cube example, the method that picks the surface format is:

vk::SurfaceFormatKHR Demo::pick_surface_format(const std::vector<vk::SurfaceFormatKHR> &surface_formats) {
    // Prefer non-SRGB formats...
    for (const auto &surface_format : surface_formats) {
        const vk::Format format = surface_format.format;

        if (format == vk::Format::eR8G8B8A8Unorm || format == vk::Format::eB8G8R8A8Unorm ||
            format == vk::Format::eA2B10G10R10UnormPack32 || format == vk::Format::eA2R10G10B10UnormPack32 ||
            format == vk::Format::eR16G16B16A16Sfloat) {
            return surface_format;
        }
    }

    printf("Can't find our preferred formats... Falling back to first exposed format. Rendering may be incorrect.\n");

    assert(surface_formats.size() >= 1);
    return surface_formats[0];
}

Why would we prefer the non-sRGB formats? Don't most screens expect the sRGB format?

Upvotes: 2

Views: 1317

Answers (1)

j00hi

Reputation: 5931

The answer to your question is: You generally do not prefer non-sRGB formats for the swapchain images. Your doubts are totally justified: The code in question arguably does not show a best-practice approach. Consider Figure 1, which shows the fundamental conversion stages involved when capturing images, storing them in a low dynamic range (LDR) format, and displaying them through a computer monitor.

Figure 1 (gamma correction): Clearly, physical radiance is in linear space (leftmost stage). When we capture an image, we often want to store it in an LDR format (e.g., 8 bits or even fewer per color channel) to best utilize storage space ("LDR image" stage). Whatever radiance is output by a monitor must again be in linear space, otherwise we would have distorted the color values (rightmost stage). Since most (all?) monitors only support LDR framebuffer formats, they apply gamma correction to the framebuffer color values before they emit the corresponding radiance (monitor stage).
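
To make the "gamma correction" stage concrete, here is a minimal sketch of the standard sRGB encoding curve, i.e., the piecewise transfer function that maps linear light to gamma space (the function name is just for illustration):

#include <cmath>

// Encode a linear-light value in [0, 1] into sRGB gamma space.
// This is the standard piecewise sRGB transfer function: a short linear
// segment near black, followed by a power curve with exponent 1/2.4.
float linear_to_srgb(float linear) {
    if (linear <= 0.0031308f) {
        return 12.92f * linear;
    }
    return 1.055f * std::pow(linear, 1.0f / 2.4f) - 0.055f;
}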

I.e., your monitor expects gamma space as input.
Therefore, the proper approach is to send an sRGB format to your monitor.

In fact, checking gpuinfo.org and sorting the list by COLOR_ATTACHMENT support, one can observe that sRGB formats are the most commonly supported formats. Here are the top four formats support-wise:

  • B8G8R8A8_SRGB: 99.88%
  • A8B8G8R8_SRGB_PACK32: 99.72%
  • R8G8B8A8_SRGB: 99.48%
  • B8G8R8A8_UNORM: 99.44%

So, I would say, default to an sRGB format!
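
As a concrete counterpart to the code from the question, a picker that prefers sRGB swapchain formats could look roughly like the following sketch (the list of preferred formats and the fallback handling are my own choices here, not taken from the official sample):

vk::SurfaceFormatKHR pick_srgb_surface_format(const std::vector<vk::SurfaceFormatKHR> &surface_formats) {
    // Prefer sRGB formats so the fixed-function hardware performs the
    // linear -> sRGB (gamma) conversion when writing to the swapchain image.
    for (const auto &surface_format : surface_formats) {
        const vk::Format format = surface_format.format;

        if ((format == vk::Format::eB8G8R8A8Srgb || format == vk::Format::eR8G8B8A8Srgb ||
             format == vk::Format::eA8B8G8R8SrgbPack32) &&
            surface_format.colorSpace == vk::ColorSpaceKHR::eSrgbNonlinear) {
            return surface_format;
        }
    }

    // Fall back to whatever the surface exposes first.
    assert(!surface_formats.empty());
    return surface_formats[0];
}

With such a format, you can keep all lighting and blending math in linear space in your shaders, and the sRGB encoding from Figure 1 is applied automatically when the color is written to the swapchain image.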

P.S.: In case you are wondering why it makes sense to use an LDR image format, have a look at Figure 2 for an explanation!

Figure 2: The top row shows evenly spaced bars w.r.t. physical light intensity. This means that the amount of emitted photons increases by the same amount between each pair of neighboring bars. The bottom row shows evenly spaced bars w.r.t. perceived light intensity; here, the increase of physical light intensity follows the gamma curve shown in Figure 1. In other words, we only have to add a few photons in darker regions for a big change in perceived light intensity, while in brighter regions we have to add a lot of photons for the same relative difference in perceived light intensity. Therefore, it makes sense to store images so that relatively more bits are allocated for the darker colors and fewer bits for the brighter colors, in order to use the available bits optimally w.r.t. the human visual system.
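
If you want to verify this bit-allocation argument numerically, here is a small sketch (my own illustration, not from the figures above) that compares how much linear light a single 8-bit code step covers near black versus near white when values are stored sRGB-encoded:

#include <cmath>
#include <cstdio>

// Decode an 8-bit sRGB code value to linear light (standard sRGB curve).
float srgb_code_to_linear(int code) {
    const float srgb = code / 255.0f;
    if (srgb <= 0.04045f) {
        return srgb / 12.92f;
    }
    return std::pow((srgb + 0.055f) / 1.055f, 2.4f);
}

int main() {
    // One code-value step near black covers only a tiny slice of linear intensity...
    const float step_near_black = srgb_code_to_linear(1) - srgb_code_to_linear(0);
    // ...while one step near white covers a much larger slice.
    const float step_near_white = srgb_code_to_linear(255) - srgb_code_to_linear(254);

    std::printf("linear step near black: %f\n", step_near_black); // roughly 0.0003
    std::printf("linear step near white: %f\n", step_near_white); // roughly 0.0089
    return 0;
}

The step near black is roughly 30 times smaller in linear terms, i.e., the darker colors get a much finer quantization, which matches how our perception works.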

Upvotes: 5
