lichoniespi

Reputation: 91

glTexImage2D causes memory leak

I plan to use OpenGL to render a video stream.

The first step after receiving the first frame of the video is to allocate a direct byte buffer and put all the frame fragments into it. The ByteBuffer is allocated only once.

directBuffer = ByteBuffer.allocateDirect(frameSize * fragmentCount);

When all frame fragments are in place, I pass the ByteBuffer to the OpenGL renderer:

public ByteBuffer getBuffer() {
    // Reuse the single direct buffer: rewind, copy every fragment's bytes in order,
    // then flip so the renderer reads from the start.
    buffer.rewind();
    fragments.stream().forEach((frameFragment) -> {
        for (byte byteFragment : frameFragment.getFrameData()) {
            buffer.put(byteFragment);
        }
    });
    buffer.flip();
    return buffer;
}

A blocking queue in the main scene loop waits for a frame to be ready, and then the scene is rendered (a minimal sketch of this hand-off follows the snippet below).

ByteBuffer frame = framesQueue.take();
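
For illustration, here is a minimal sketch of that producer/consumer hand-off. Only framesQueue.take() appears in the original code; the class and method names (FramePipeline, onFrameComplete, nextFrame) and the queue capacity are assumptions.

import java.nio.ByteBuffer;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FramePipeline {
    // Hypothetical wrapper around the queue used in the question.
    // A small capacity keeps latency low: the renderer always gets a
    // recently completed frame and the producer blocks if it gets ahead.
    private final BlockingQueue<ByteBuffer> framesQueue = new ArrayBlockingQueue<>(1);

    // Producer side: called once all fragments of a frame are assembled.
    public void onFrameComplete(ByteBuffer assembledFrame) throws InterruptedException {
        framesQueue.put(assembledFrame);
    }

    // Consumer side: called from the render loop.
    public ByteBuffer nextFrame() throws InterruptedException {
        return framesQueue.take();
    }
}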

Afterwards I clear the scene, set the viewport, and so on:

    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1, 1, 1);
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    glOrtho(-480, 480, -270, 270, -1, 1);
    glPushMatrix();
    glViewport(0, 0, 768, 576);

When that is done, I'm ready to draw a textured quad onto the scene:

    glBindTexture(GL_TEXTURE_2D, glGenTextures());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 768, 576, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);
    glBegin(GL_QUADS);
    {
        glTexCoord2f(0.0f, 0.0f);
        glVertex2f(0.0f, 0.0f);

        glTexCoord2f(1.0f, 0.0f);
        glVertex2f(768, 0.0f);

        glTexCoord2f(1.0f, 1.0f);
        glVertex2f(768, 576);

        glTexCoord2f(0.0f, 1.0f);
        glVertex2f(0.0f, 576);
    }
    glEnd();

The program runs, the video is pretty smooth, and it has reasonably low latency (which was the main concern).

The problem is that this call

        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 768, 576, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);

is causing a memory leak.

The Java heap space seems fine:

[screenshot: Java heap usage]

But the memory usage of the Java process keeps growing indefinitely:

[screenshot: process memory usage]

As a test, I commented out the call to

   glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 768, 576, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);

and the memory leak did not occur. I also tried the drawPixels method, which helped as well, but I think using textures is the way to go here, not the deprecated drawPixels path.

How can I solve the memory leak? Alternatively, what other efficient ways are there to display a new texture on the scene every 40 ms? Latency is critical.

Upvotes: 0

Views: 2013

Answers (1)

lichoniespi

Reputation: 91

This call turned out to be the problem:

glBindTexture(GL_TEXTURE_2D, glGenTextures());

Since I'm only using a single texture, the call can be replaced with

glBindTexture(GL_TEXTURE_2D, 0);

That prevents OpenGL from creating a new texture on every call.
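
A common alternative sketch (assuming LWJGL's GL11 bindings and the question's 768x576 GL_LUMINANCE format; the helper names initTexture and uploadFrame are hypothetical) is to generate the texture once at startup and then overwrite its contents each frame with glTexSubImage2D, so no new texture storage is allocated per frame:

import static org.lwjgl.opengl.GL11.*;

import java.nio.ByteBuffer;

private int textureId;

private void initTexture() {
    textureId = glGenTextures();               // create exactly one texture name
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    // Allocate the texture storage once, with no initial data.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 768, 576, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, (ByteBuffer) null);
}

private void uploadFrame(ByteBuffer frame) {
    glBindTexture(GL_TEXTURE_2D, textureId);
    // Overwrite the existing storage instead of creating new storage every frame.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 768, 576,
                    GL_LUMINANCE, GL_UNSIGNED_BYTE, frame);
}

With this pattern the per-frame cost is just the pixel upload, which should fit comfortably within a 40 ms frame budget.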

Upvotes: 3
