stanley76726

Reputation: 1

A grayscale gradient image on an HDR10 monitor looks "whiter" (lighter) than in SDR mode

I'm trying to display 10-bit grayscale images on an HDR10 monitor.

I implemented a Windows app following the DirectXTK tutorial "Using HDR rendering" (which is based on Direct3D 11). For comparison between HDR and SDR, I also duplicated the same app with HDR disabled.
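
For context, the main difference between the two builds is the swap-chain format and color space. Below is a simplified sketch of that part (illustrative only; the function name and structure are mine, not the tutorial's exact code):

    // Minimal sketch of how the HDR and SDR builds could differ in swap-chain
    // setup (illustrative, not the DirectXTK tutorial's exact code). Assumes a
    // DXGI 1.4+ flip-model swap chain; for HDR10 the back buffer would also be
    // created as DXGI_FORMAT_R10G10B10A2_UNORM instead of an 8-bit format.
    #include <windows.h>
    #include <dxgi1_4.h>

    HRESULT SetOutputColorSpace(IDXGISwapChain3* swapChain, bool enableHDR10)
    {
        // HDR10: signal is interpreted with the ST.2084 (PQ) curve in BT.2020.
        // SDR:   signal is interpreted with the usual gamma-2.2 curve in BT.709.
        const DXGI_COLOR_SPACE_TYPE colorSpace = enableHDR10
            ? DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020   // HDR10 (PQ / BT.2020)
            : DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709;     // SDR   (gamma 2.2 / BT.709)

        UINT support = 0;
        HRESULT hr = swapChain->CheckColorSpaceSupport(colorSpace, &support);
        if (FAILED(hr))
            return hr;
        if (!(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
            return DXGI_ERROR_UNSUPPORTED;

        return swapChain->SetColorSpace1(colorSpace);
    }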

I ran a test displaying a grayscale gradient image with 26 steps and found that the middle-to-high steps in the HDR app looked "whiter" (lighter) than in the SDR app: Grayscale gradient: HDR vs. SDR. This makes some of my real images look washed out when the pixel values of a region fall within those steps.
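
To illustrate what I mean by steps: a 26-step pattern with equally spaced 10-bit gray levels would be generated roughly like this (simplified sketch, not my exact generation code):

    // Sketch of a 26-step grayscale test pattern. The equal linear spacing from
    // black to white is an assumption here, not necessarily the exact pattern.
    #include <cstdint>
    #include <vector>

    std::vector<uint16_t> MakeGraySteps10Bit(int stepCount = 26)
    {
        // Step i holds the normalized gray level i / (stepCount - 1), quantized
        // to a 10-bit code value (0..1023); steps 12 and 13 land near mid-gray.
        std::vector<uint16_t> codes(stepCount);
        for (int i = 0; i < stepCount; ++i)
        {
            const float level = static_cast<float>(i) / static_cast<float>(stepCount - 1);
            codes[i] = static_cast<uint16_t>(level * 1023.0f + 0.5f);
        }
        return codes;
    }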

I expected the middle step (the 12th or 13th) to look nearly mid-gray in both the HDR and SDR apps, but in my test it did not in the HDR app. A similar result can also be seen in the Microsoft D3D12HDR sample. Is my understanding of HDR rendering wrong?
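
To make the expectation concrete: if the HDR signal is interpreted with the ST.2084 (PQ) EOTF and the SDR signal with a plain gamma-2.2 curve, the same middle code value maps to noticeably different luminance. A rough check of this (the EOTF models and the 250-nit SDR peak are assumptions, not measurements of my monitor):

    // Rough check of the expectation above, assuming the HDR app's signal is
    // interpreted with the ST.2084 (PQ) EOTF and the SDR app's with a plain
    // gamma-2.2 EOTF. The SDR peak luminance is an illustrative assumption.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // SMPTE ST.2084 (PQ) EOTF: normalized signal -> absolute luminance in nits.
    double PqSignalToNits(double signal)
    {
        const double m1 = 2610.0 / 16384.0;
        const double m2 = 2523.0 / 4096.0 * 128.0;
        const double c1 = 3424.0 / 4096.0;
        const double c2 = 2413.0 / 4096.0 * 32.0;
        const double c3 = 2392.0 / 4096.0 * 32.0;
        const double p  = std::pow(std::clamp(signal, 0.0, 1.0), 1.0 / m2);
        return 10000.0 * std::pow(std::max(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
    }

    // Simple gamma-2.2 SDR model: normalized signal -> nits at an assumed peak.
    double GammaSignalToNits(double signal, double peakNits = 250.0)
    {
        return peakNits * std::pow(std::clamp(signal, 0.0, 1.0), 2.2);
    }

    int main()
    {
        const double midStep = 12.0 / 25.0;  // 13th of 26 steps, ~0.48
        std::printf("PQ:    %.1f nits\n", PqSignalToNits(midStep));    // ~75 nits
        std::printf("Gamma: %.1f nits\n", GammaSignalToNits(midStep)); // ~50 nits
    }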

Upvotes: 0

Views: 280

Answers (0)
