Reputation: 587
I'm loading 32-bit RGBA normalmap textures, with a heightmap encoded in the alpha channel, via SDL2 2.0.7 and SDL2_image 2.0.2 on OS X Sierra.
Every pixel in these textures has a non-zero RGB value, encoding a directional normal vector. A directional vector of (0, 0, 0) (i.e. black) is invalid.
And yet, when I load such a texture via SDL2_image, the areas of the texture with an alpha value of 0 also yield RGB values of 0. My suspicion is that SDL is pre-multiplying the alpha into the RGB channels for these pixels?
Attached is one of these normalmap textures. You can confirm it is valid by opening the texture in e.g. GIMP and using the color picker on one of the transparent areas. You'll see that, indeed, the transparent areas still have an RGB color that is blue-ish (an encoded normal vector).
And below is a minimal test case illustrating the issue for the attached PNG file:
#include <SDL_image.h>
#include <assert.h>

int main(int argc, char **argv) {
    SDL_Surface *s = IMG_Load("green3_2_nm.png");
    assert(s);
    /* This test assumes a 32-bit surface with tightly packed rows
       (pitch == w * BytesPerPixel). */
    assert(s->format->BytesPerPixel == 4);
    for (int i = 0; i < s->w * s->h; i++) {
        /* Cast to Uint8 * first: pointer arithmetic on void * is a GNU
           extension, not standard C. */
        const Uint32 *in = (const Uint32 *)
            ((const Uint8 *) s->pixels + i * s->format->BytesPerPixel);
        SDL_Color color;
        SDL_GetRGBA(*in, s->format, &color.r, &color.g, &color.b, &color.a);
        /* Every pixel should decode to a non-zero normal vector. */
        assert(color.r || color.g || color.b);
    }
    SDL_FreeSurface(s);
    return 0;
}
I'm compiling this test case with gcc test.c $(pkg-config --cflags --libs SDL2_image)
The assertion on color.r || color.g || color.b will fail several rows into the image -- i.e. exactly where the alpha value drops to 0.
I have tried both the TGA and PNG image formats; SDL_image does the same thing to both.
Is this a bug in SDL, or am I missing something? I'm curious if folks see this same issue on other platforms as well.
===
Answer: Core Graphics, the default image loading backend for SDL2_image on Apple OS X, does indeed pre-multiply alpha -- always. The solution is to recompile SDL2_image without Core Graphics support, and instead enable libpng, libjpeg, and any other image codecs you require:
./configure \
--disable-imageio \
--disable-png-shared \
--disable-tif-shared \
--disable-jpg-shared \
--disable-webp-shared
On my system, I had to disable Core Graphics (imageio) and also the shared-library loading of the other codecs, as you can see. This produced a fat SDL2_image.so that was statically linked against libpng, libjpeg, etc., but worked as expected.
Upvotes: 3
Views: 577
Reputation: 213688
SDL_image is a wrapper around platform-specific image loading code, rather than using the same image loader on all platforms.
This has the advantage of reducing the size of SDL_image, since it doesn't have to ship with any image decoding code, and can instead link dynamically against something that's likely installed on your system already. However, on macOS and iOS, Core Graphics does not support non-premultiplied alpha, so SDL_image has to reverse it.
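Premultiplying means each color channel is scaled by the pixel's alpha at decode time. Conceptually, it amounts to something like this per pixel (a sketch of the idea only, assuming a tightly packed RGBA8888 layout; this is not Core Graphics' actual code):

/* Conceptual premultiply for one packed RGBA8888 pixel; an
   illustration, not Core Graphics' actual implementation. */
static void premultiply(unsigned char *rgba) {
    unsigned char a = rgba[3];
    /* Each channel becomes round(c * a / 255). With a == 0, every
       channel collapses to 0 -- exactly what the question observes. */
    for (int i = 0; i < 3; i++)
        rgba[i] = (unsigned char) ((rgba[i] * a + 127) / 255);
}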
See: mac-opengl - Re: kCGImageAlphaFirst not implemented (Was: (no subject)) from May 2007 (via the Wayback Machine):
Honestly, I wouldn't expect CGBitmapContextCreate() to support non-premultiplied alpha any time soon.... I'm not sure if using ImageIO + CoreGraphics was ever really targetted at being used for an image loading scheme for OpenGL applications.
This behavior was discovered in LibSDL bug #838 - OSX SDL darkens colours proportional to increase in alpha, and a workaround was introduced in SDL_image changeset 240.
You can see that the workaround merely un-premultiplies the alpha, which is a horribly lossy process: wherever alpha is 0 the premultiplied RGB channels are also 0, so the original color cannot be recovered at all.
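To make the loss concrete, here is a minimal sketch of the reversal and its quantization error (an illustration assuming a tightly packed RGBA8888 buffer; this is not the actual changeset 240 code):

#include <stdio.h>

/* Sketch of an un-premultiply pass; not SDL_image's actual code. */
static void unpremultiply(unsigned char *rgba, int npixels) {
    for (int i = 0; i < npixels; i++, rgba += 4) {
        unsigned char a = rgba[3];
        if (a == 0)
            continue; /* RGB was scaled by 0: nothing left to recover */
        rgba[0] = (unsigned char) (rgba[0] * 255 / a);
        rgba[1] = (unsigned char) (rgba[1] * 255 / a);
        rgba[2] = (unsigned char) (rgba[2] * 255 / a);
    }
}

int main(void) {
    /* A bluish normal pixel (128, 128, 255) premultiplied at alpha 3
       quantizes down to (2, 2, 3). */
    unsigned char px[4] = { 2, 2, 3, 3 };
    unpremultiply(px, 1);
    printf("%d %d %d\n", px[0], px[1], px[2]); /* 170 170 255, not 128 128 255 */
    return 0;
}

Even where alpha is small but non-zero, the 8-bit quantization destroys most of the color precision; where alpha is 0, the color is gone entirely.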
To address this, you could build your own version of SDL_image on macOS that uses LibPNG. This should be possible just through configuration; you should not have to make any changes to the SDL_image code itself. To do this, use the --disable-imageio option. SDL_image ships with its own copy of the LibPNG code, so you should not need to install LibPNG in order to get this to work.
Upvotes: 5