Reputation: 1079
so, there are many different color spaces that go by the name RGB. some are linear, and some are gamma-encoded. and i'm assuming that which one is in use varies from monitor to monitor.
which one should i interpret the rgb values in a directx or opengl shader as?
is there a 'safe' color space that i can assume a typical user will be using? like, is there an RGB space that everyone in the gaming industry assumes their users will be using?
or is there a way to detect what color space the monitor is using?
i need to know so that i can convert to and from standard color spaces like CIE XYZ. it's fine if it's a little wrong, but i'd like to be as close as possible for typical hardware.
Upvotes: 2
Views: 1171
Reputation: 162164
Without further provision, the colors going into OpenGL pass straight through to the monitor, with just the video gamma ramp applied. All color operations in OpenGL are linear.
However, OpenGL does provide special sRGB texture formats which linearize texture colors from sRGB color space before any color operations are applied. To complement this, sRGB framebuffer formats are also supported.
Also see these for further info http://www.g-truc.net/post-0263.html and http://www.arcsynthesis.org/gltut/Texturing/Tut16%20Free%20Gamma%20Correction.html
Of course, for some applications sRGB is an unfit color space (too small a gamut, too low fidelity at 8 bits). In those situations I'd recommend "raw color" OpenGL image and framebuffer formats and performing the color transformations in the shader. Use some connection color space (XYZ is well suited for this) with some HDR image format (i.e. 10 or more bits per channel) for the textures (mapping R=X, G=Y, B=Z). Render to an off-screen framebuffer object and, in a final post-processing step, transform from the XYZ framebuffer to the screen's color space.
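If the screen target is (linear) sRGB, that final transform is just a 3×3 matrix multiply. A CPU-side sketch using the standard sRGB/D65 matrices — in the shader this would be the same numbers in a mat3:

```c
/* Linear sRGB -> CIE XYZ, D65 white point (standard sRGB matrix). */
void rgb_to_xyz(const float rgb[3], float xyz[3])
{
    xyz[0] = 0.4124f * rgb[0] + 0.3576f * rgb[1] + 0.1805f * rgb[2]; /* X */
    xyz[1] = 0.2126f * rgb[0] + 0.7152f * rgb[1] + 0.0722f * rgb[2]; /* Y */
    xyz[2] = 0.0193f * rgb[0] + 0.1192f * rgb[1] + 0.9505f * rgb[2]; /* Z */
}

/* CIE XYZ -> linear sRGB, the inverse of the matrix above.
 * Results outside [0,1] mean the XYZ color is outside the sRGB gamut. */
void xyz_to_rgb(const float xyz[3], float rgb[3])
{
    rgb[0] =  3.2406f * xyz[0] - 1.5372f * xyz[1] - 0.4986f * xyz[2];
    rgb[1] = -0.9689f * xyz[0] + 1.8758f * xyz[1] + 0.0415f * xyz[2];
    rgb[2] =  0.0557f * xyz[0] - 0.2040f * xyz[1] + 1.0570f * xyz[2];
}
```

note this yields *linear* RGB; the sRGB transfer curve still has to be applied afterwards (by the shader, or by an sRGB framebuffer format) before the values hit the monitor.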
or is there a way to detect what color space the monitor is using?
OpenGL doesn't deal with device management. You'll have to use the OS's facilities to determine the output device's color profile.
Upvotes: 3