I have the following code in my font rendering test:
SDL_Surface* image = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                          face->glyph->bitmap.width,
                                          face->glyph->bitmap.rows,
                                          8,
                                          0, 0, 0, 0);
if(!image)
{
    throw std::runtime_error("Failed to generate 8 bit image");
}
// There's no better way to do this, right? Seeing as it just sets up a basic grayscale palette
SDL_Color colors[256];
for(int i = 0; i < 256; ++i)
{
    colors[i].r = i;
    colors[i].g = i;
    colors[i].b = i;
}
SDL_SetColors(image, colors, 0, 256);
SDL_LockSurface(image);
uint8_t* pixels = static_cast<uint8_t*>(image->pixels);
for(int i = 0; i < face->glyph->bitmap.rows; ++i)
{
    for(int j = 0; j < face->glyph->bitmap.width; ++j)
    {
        pixels[i * face->glyph->bitmap.width + j] =
            face->glyph->bitmap.buffer[i * face->glyph->bitmap.width + j];
    }
}
###############################################################################
# Here's the spotlight #
# | #
# _________________/ #
# / #
###############################################################################
image->pitch = image->w;
SDL_UnlockSurface(image);
Notice that I change the pitch to the width, so each row is exactly width bytes (one byte per pixel, with no row padding). This is exactly what I need; in fact, changing the pitch like this is the deciding factor in whether my letters come out crooked or not.
My question is: Is this defined, safe, and good practice? Is there a function to help me out that I don't know about? Is there a safer alternative?
Upvotes: 0
Views: 397
Reputation: 71
It is wrong to write to image->pitch. It might appear to work in some situations, but it can cause a corrupted display or a segfault in others; it all depends on where that chunk of memory came from.
I think your calculation of the index into pixels[] is wrong; it should be:
pixels[i * image->pitch + j] = ...
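A minimal sketch of what that corrected copy loop might look like, assuming the same SDL 1.2 / FreeType setup as in the question; the destination index uses the surface's own pitch and the source index uses the glyph bitmap's pitch, so there is no need to overwrite image->pitch at all:

SDL_LockSurface(image);
uint8_t* dst = static_cast<uint8_t*>(image->pixels);
const uint8_t* src = face->glyph->bitmap.buffer;
for(int i = 0; i < face->glyph->bitmap.rows; ++i)
{
    for(int j = 0; j < face->glyph->bitmap.width; ++j)
    {
        // image->pitch is the real byte length of a surface row (it may be
        // larger than the width because of alignment padding), and
        // bitmap.pitch is the byte length of a row in the FreeType bitmap
        // (assumed positive here).
        dst[i * image->pitch + j] = src[i * face->glyph->bitmap.pitch + j];
    }
}
SDL_UnlockSurface(image);

Because pitch is bytes per row and can include padding beyond the width, indexing by the width alone is what skews the rows; indexing by the pitch fixes that without touching the surface's internals.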
Upvotes: 2