Reputation: 6282
I wanted to create a semi-transparent overlay for my screen and decided to dynamically create a custom Texture2D
object using the following code:
const int TEX_WIDTH = 640;
const int TEX_HEIGHT = 480;
Texture2D redScreen;

void GenerateTextures(GraphicsDevice device)
{
    redScreen = new Texture2D(device, TEX_WIDTH, TEX_HEIGHT);
    uint[] red = new uint[TEX_WIDTH * TEX_HEIGHT];
    for (int i = 0; i < TEX_WIDTH * TEX_HEIGHT; i++)
        red[i] = 0x1A0000ff;
    redScreen.SetData<uint>(red);
}
And it just doesn't seem to work as expected! Looking at this code, I would expect the alpha value to be about 10% (0x1A / 0xFF ≈ 0.10), but it ends up being much more than that.
It seems to me that the uint represents an ARGB value, but the transparency value is never what I set it to be: it's either "somewhat transparent" or not transparent at all.
I don't like asking vague questions, but what am I doing wrong? What's wrong with this code snippet?
Edit:
In the end, I could only get the desired result by setting BlendState.NonPremultiplied in the spriteBatch.Begin() call.
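For reference, the working call looks something like this (a minimal sketch; SpriteSortMode.Deferred is just the default sort mode):
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.NonPremultiplied);
spriteBatch.Draw(redScreen, Vector2.Zero, Color.White);
spriteBatch.End();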
Upvotes: 2
Views: 3607
Reputation: 1936
XNA uses premultiplied alpha by default, so you have to multiply each of the color channels by the alpha value. There is also a Color struct that you might find convenient, so I suggest the code below. Alpha should be between 0 and 1 inclusive.
const int TEX_WIDTH = 640;
const int TEX_HEIGHT = 480;
Texture2D redScreen;

void GenerateTextures(GraphicsDevice device, float alpha)
{
    redScreen = new Texture2D(device, TEX_WIDTH, TEX_HEIGHT);
    uint[] red = new uint[TEX_WIDTH * TEX_HEIGHT];
    // Color * alpha scales every channel (including alpha itself),
    // which is what premultiplied alpha requires; PackedValue yields
    // the uint layout that SetData expects.
    uint premultiplied = (new Color(255, 0, 0) * alpha).PackedValue;
    for (int i = 0; i < TEX_WIDTH * TEX_HEIGHT; i++)
        red[i] = premultiplied;
    redScreen.SetData<uint>(red);
}
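With the data premultiplied like this, the texture draws correctly under SpriteBatch's default blend state, so Begin needs no arguments; a minimal usage sketch:
spriteBatch.Begin();   // the default BlendState.AlphaBlend expects premultiplied data
spriteBatch.Draw(redScreen, Vector2.Zero, Color.White);
spriteBatch.End();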
Upvotes: 3
Reputation: 10896
I don't see you specifying a surface/pixel format. Are you sure each pixel is a uint?
To be sure, create the texture with an explicit SurfaceFormat and then compute the packed value for a given R, G, B and A instead of hard-coding the literal.
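For example, something along these lines (a minimal sketch, assuming XNA 4.0's default SurfaceFormat.Color, which stores R in the lowest byte of the packed uint and A in the highest):
// Create the texture with an explicit format rather than relying on the default.
Texture2D tex = new Texture2D(device, TEX_WIDTH, TEX_HEIGHT, false, SurfaceFormat.Color);
// Build the pixel from named channels instead of a hand-packed literal.
Color pixel = new Color(255, 0, 0, 26);   // R, G, B, A (~10% alpha)
uint packed = pixel.PackedValue;          // 0x1A0000FF for this format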
Upvotes: 3