Ulrik

SDL_surface to OpenGL texture

Hey, I have this function to load an SDL_Surface and upload it as an OpenGL texture:

typedef GLuint texture;

texture load_texture(std::string fname){
    SDL_Surface *tex_surf = IMG_Load(fname.c_str());
    if(!tex_surf){
        return 0;
    }
    texture ret;
    glGenTextures(1, &ret);
    glBindTexture(GL_TEXTURE_2D, ret);
    glTexImage2D(GL_TEXTURE_2D, 0, 3, tex_surf->w, tex_surf->h, 0, GL_RGB, GL_UNSIGNED_BYTE, tex_surf->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    SDL_FreeSurface(tex_surf);
    return ret;
}

The problem is that it isn't working. When I call the function from the main function, no image gets loaded (when displayed, the quad just shows the current drawing color), and when I call it from any function outside the main function, the program crashes. This is the line that makes it crash:

glTexImage2D(GL_TEXTURE_2D, 0, 3, tex_surf->w, tex_surf->h, 0, GL_RGB, GL_UNSIGNED_BYTE, tex_surf->pixels);

Can anybody see a mistake in this?

Upvotes: 2

Views: 10517

Answers (4)

Marcello Herreshoff

Reputation: 11

Some older hardware (and, surprisingly, emscripten's OpenGL ES 2.0 emulation, even running on a machine I bought this year) doesn't support textures whose dimensions aren't powers of two. That turned out to be the problem I was stuck on for a while: I was getting a black rectangle instead of the sprite I wanted. So it's possible the poster's problem would go away after resizing the image so its dimensions are powers of two.

See: https://www.khronos.org/opengl/wiki/NPOT_Texture

Upvotes: 0

Peter Holzer

Reputation: 173

The problem probably lies in the 3rd argument (internalformat) of the call to glTexImage2D.

glTexImage2D(GL_TEXTURE_2D, 0, 3, tex_surf->w, tex_surf->h, 0, GL_RGB, GL_UNSIGNED_BYTE, tex_surf->pixels);

You have to use constants like GL_RGB or GL_RGBA, because the actual values of these macros are not related to the number of color components.

A list of allowed values is in the reference manual: https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glTexImage2D.xhtml .

This seems to be a frequent mistake. Some drivers may be lenient and silently correct it, so the wrong line might still work for some people.

/usr/include/GL/gl.h:473:#define GL_RGB             0x1907
/usr/include/GL/gl.h:474:#define GL_RGBA            0x1908

Upvotes: 1

ParoXoN

Reputation: 582

I'm not sure if you're doing this somewhere outside your code snippet, but have you called

glEnable(GL_TEXTURE_2D);

at some point?

Upvotes: 0

Zack The Human

Reputation: 8491

My bet is you need to convert the SDL_Surface before trying to cram it into an OpenGL texture. Here's something that should give you the general idea:

SDL_Surface* originalSurface; // Loaded like any other SDL_Surface

// Round each dimension up to the nearest power of two
int w = (int)pow(2, ceil( log(originalSurface->w) / log(2) ));
int h = (int)pow(2, ceil( log(originalSurface->h) / log(2) ));

// For a 24-bpp surface the masks must fit in 24 bits; swap the R and B
// masks if the colors come out wrong on your machine's byte order
SDL_Surface* newSurface =
  SDL_CreateRGBSurface(0, w, h, 24, 0x00ff0000, 0x0000ff00, 0x000000ff, 0);
SDL_BlitSurface(originalSurface, 0, newSurface, 0); // Blit onto a purely RGB surface

texture ret;

glGenTextures( 1, &ret );
glBindTexture( GL_TEXTURE_2D, ret );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_RGB,
          GL_UNSIGNED_BYTE, newSurface->pixels );

I found the original code here. There may be some other useful posts on GameDev as well.

Upvotes: 2
