Reputation: 19
Please don't tell me to use a 24-bit texture or anything like that, because this is for a pet project of mine and it needs to support 1-, 4-, 8-, 16- and 32-bit texture formats.
All of the formats above work, even 1 bpp, except for the 4-bit one. I should also mention that the colors used are 16 shades of gray, and that the rows of every texture I use are padded to 4 bytes.
So, for example, let's say I want to load this 4x4 texture:
BYTE Img[4][4] = {
{0xCD, 0xEF, 0x00, 0x00},
{0x89, 0xAB, 0x00, 0x00},
{0x45, 0x67, 0x00, 0x00},
{0x01, 0x23, 0x00, 0x00},
};
Each half-byte should represent a texel; the 0x00 bytes are there for padding purposes.
How on earth do I load such a texture?
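For arbitrary widths, that 4-byte row alignment can be computed like this (a minimal sketch; `padded_stride_4bpp` is a made-up helper name, not from the post):

```c
#include <assert.h>

/* Bytes per row of a 4-bpp image, rounded up to a 4-byte boundary. */
static unsigned padded_stride_4bpp(unsigned width)
{
    unsigned raw = (width + 1) / 2;  /* two texels per byte */
    return (raw + 3) & ~3u;          /* round up to a multiple of 4 */
}
```

For the 4x4 example above this yields 4: two bytes of texel data plus two bytes of padding per row.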
Now, I tried
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE4, 4, 4, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, pTex);
but for some reason it gives the same result as an 8-bit GL_LUMINANCE texture: I only see a 2x4 texture pattern on the screen, with the other half black.
I also tried glPixelMapfv, but I can't make it work; all I get is a blank texture.
I hope I've been clear enough; if not, just tell me and I'll make the corrections.
Upvotes: 1
Views: 1121
Reputation: 162164
GL_LUMINANCE4 is the internal format, i.e. the format the texture data will be stored in on the OpenGL side. What you want, however, is to specify a 4-bit-per-pixel external data format for unpacking the pixels. That is determined by the format and type parameters of glTexImage2D. The format in your case is GL_LUMINANCE (in later versions of OpenGL this would be GL_RED, just FYI). As for type, I'm sorry to have to tell you that there is no 4-bit-per-pixel unpacking type available in OpenGL. There might be a vendor-specific extension, though.
Your best course of action is to unpack the data into an 8-bit-per-pixel format yourself and then upload that. This is not very difficult to implement.
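A minimal sketch of that unpacking step could look like the following (assuming, as the example data in the question suggests, that the high nibble of each byte holds the left texel; `unpack4to8` is a made-up name, and each nibble is scaled by 17 so that 0x0-0xF covers the full 0-255 range):

```c
#include <stddef.h>

/* Expand a row-padded 4-bpp grayscale image into a tightly packed
 * 8-bpp buffer. `stride` is the padded row size in bytes (4 in the
 * 4x4 example); `dst` must hold w * h bytes. */
static void unpack4to8(const unsigned char *src, size_t stride,
                       unsigned w, unsigned h, unsigned char *dst)
{
    for (unsigned y = 0; y < h; ++y) {
        const unsigned char *row = src + (size_t)y * stride;
        for (unsigned x = 0; x < w; ++x) {
            unsigned char byte = row[x / 2];
            /* high nibble first; swap the branches if your packer
             * stores the low nibble first */
            unsigned char nib = (x & 1) ? (unsigned char)(byte & 0x0F)
                                        : (unsigned char)(byte >> 4);
            dst[(size_t)y * w + x] = (unsigned char)(nib * 17);
        }
    }
}
```

The resulting buffer can then be uploaded with `glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, w, h, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, dst)`; you can keep GL_LUMINANCE4 as the internal format if you still want 4-bit storage on the GL side.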
Upvotes: 3