Reputation: 30125
I'm starting to add D3D10 support to go along with my existing D3D9 graphics backend.
The problem is that all the existing code (in several applications...) uses ARGB-formatted colours, but I couldn't find a matching format mode for D3D10. Does D3D10 not support ARGB colour formats at all, or have I just missed something? If I haven't missed something, what is a good way to convert between them? It just requires the first byte to be moved to the end, which seems like a pretty simple operation, yet I can't see any way to do it other than breaking the colour into its components and reconstructing it, e.g.:
// unsigned colIn (packed 0xAARRGGBB in), colOut (packed 0xRRGGBBAA out)
unsigned char
    a = (colIn & 0xFF000000) >> 24,
    r = (colIn & 0x00FF0000) >> 16,
    g = (colIn & 0x0000FF00) >> 8,
    b = (colIn & 0x000000FF);
colOut = (r << 24) | (g << 16) | (b << 8) | a;
Upvotes: 2
Views: 970
Reputation: 21
It looks to me like that format has gone missing, so you would need to re-order your data. However, if you're going to use RGBA, you don't need to separate all the channels the way your example does: R, G and B stay consecutive and in the same order, so you can move those three channels as one chunk, as in the sketch below.
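For example (a minimal sketch, assuming colIn and colOut are packed 32-bit unsigned values as in the question), moving the first byte to the end is just an 8-bit rotate of the whole value:

// Sketch: rotate the packed value left by 8 bits so the alpha byte
// moves from the top to the bottom (0xAARRGGBB -> 0xRRGGBBAA).
colOut = (colIn << 8) | (colIn >> 24);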
Upvotes: 2
Reputation: 399793
Looking at the relevant enum type (DXGI_FORMAT), I (too) fail to find any AxRxGxBx formats. So it seems you need to do the swizzling yourself, then.
This is very well suited to SSE optimization, of course; check whether your compiler is able to turn the code into something that uses SSE, and performance should be fine.
Also think about endianness when writing this code; it's easy to make a mistake, and this kind of code is hard to get right without paying attention to byte order.
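As a rough sketch of an explicit SSE version (assuming SSSE3 is available, a little-endian target, and a pixel count that is a multiple of 4; the function name and buffer handling are only illustrative), the same per-pixel rotate can be done four pixels at a time with a byte shuffle:

#include <cstddef>
#include <immintrin.h>

// Sketch only: converts `count` packed 0xAARRGGBB values to the rotated
// 0xRRGGBBAA layout, four pixels per iteration, using an SSSE3 byte shuffle.
void argb_to_rgba(const unsigned* in, unsigned* out, std::size_t count)
{
    // Within each 32-bit lane, output byte i takes input byte {3,0,1,2}[i],
    // which on a little-endian target is a left rotate of the lane by 8 bits.
    const __m128i mask = _mm_setr_epi8(3, 0, 1, 2, 7, 4, 5, 6,
                                       11, 8, 9, 10, 15, 12, 13, 14);
    for (std::size_t i = 0; i < count; i += 4)
    {
        __m128i px = _mm_loadu_si128(reinterpret_cast<const __m128i*>(in + i));
        px = _mm_shuffle_epi8(px, mask);
        _mm_storeu_si128(reinterpret_cast<__m128i*>(out + i), px);
    }
}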
Upvotes: 1