Spamdark

Reputation: 1391

FFmpeg memory leak

I have developed a simple library by modifying a library that I found on the internet.

What worries me is this: when I play an AVI, it plays and the memory is freed when the video ends, but with this video it looks like a memory leak. Memory usage grows to 138 MB even though the video has ended and the FreeAll method (a function that deletes the context, etc.) has been called.

Here is the code of the method that is causing the memory leak:

int VideoGL::NextVideoFrame(){
    int frameDone = 0;
    int result = 0;
    double pts = 0;

    if(!this->ended){

        if (!_started) return 0;
        AVPacket* packet;

        // Get the number of milliseconds passed and see if we should display a new frame
        int64_t msPassed = (1000 * (clock() - _baseTime)) / CLOCKS_PER_SEC;
        if (msPassed >= _currentPts)
        {
            // If this is not the current frame, copy it to the buffer
            if (_currentFramePts != _currentPts){
                _currentFramePts = _currentPts;
                memcpy(buffer_a, buffer, 3 * _codec_context_video->width * _codec_context_video->height);
                result = 1;
            }

            // Try to load a new frame from the video packet queue
            bool goodop = false;
            AVFrame *_n_frame = avcodec_alloc_frame();
            while (!frameDone && (packet = this->DEQUEUE(VIDEO)) != NULL)
            {
                if (packet == (AVPacket*)-1) return -1;

                goodop = true;

                _s_pts = packet->pts;
                avcodec_decode_video2(_codec_context_video, _n_frame, &frameDone, packet);
                av_free_packet(packet);

                if (packet->dts == AV_NOPTS_VALUE)
                {
                    if (_n_frame->opaque && *(uint64_t*)_n_frame->opaque != AV_NOPTS_VALUE)
                        pts = (double) *(uint64_t*)_n_frame->opaque;
                    else
                        pts = 0;
                }
                else pts = (double) packet->dts;

                pts *= av_q2d(_codec_context_video->time_base);
            }

            if (frameDone)
            {
                // If a frame was loaded, scale it to the current texture frame buffer,
                // but also set the pts so that it won't be copied to the texture until it's time
                sws_scale(sws_ctx, _n_frame->data, _n_frame->linesize, 0,
                          _codec_context_video->height, _rgb_frame->data, _rgb_frame->linesize);

                double nts = 1.0/av_q2d(_codec_context_video->time_base);
                _currentPts = (uint64_t) (pts*nts);
            }

            avcodec_free_frame(&_n_frame);
            av_free(_n_frame);

            if(!goodop){
                ended = true;
            }
        }
    }

    return result;
}

I'll be waiting for answers, thanks.

Upvotes: 5

Views: 7722

Answers (3)

Mauricio

Reputation: 21

I had a memory leak problem as well. For me, the deallocation worked once I included the following calls:

class members:

AVPacket avpkt;
AVFrame *frame;
AVCodecContext *avctx;
AVCodec *codec;

constructor:

av_init_packet(&avpkt);
avcodec_open2(avctx, codec, NULL);
frame = avcodec_alloc_frame();

destructor:

av_free_packet(&avpkt);
avcodec_free_frame(&frame);
av_free(frame);
avcodec_close(avctx);
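
A minimal sketch of how these pieces might fit together in a single class; the class name and the way avctx and codec are obtained are placeholders, and the point is only that every allocation made in the constructor has a matching release in the destructor. (Note that avcodec_free_frame already sets the pointer to NULL, so the extra av_free(frame) above is a harmless no-op.)

// Hypothetical wrapper; names are illustrative, not taken from the original project.
extern "C" {
#include <libavcodec/avcodec.h>
}

class DecoderState {
public:
    // avctx and codec are assumed to come from elsewhere (e.g. the format context).
    DecoderState(AVCodecContext *ctx, AVCodec *dec) : avctx(ctx), codec(dec), frame(NULL)
    {
        av_init_packet(&avpkt);            // packet is a plain member, nothing heap-allocated yet
        avcodec_open2(avctx, codec, NULL); // error handling omitted for brevity
        frame = avcodec_alloc_frame();     // old API; newer FFmpeg uses av_frame_alloc()
    }

    ~DecoderState()
    {
        av_free_packet(&avpkt);            // release any data still attached to the packet
        avcodec_free_frame(&frame);        // frees the frame and sets frame to NULL
        avcodec_close(avctx);              // close the codec context opened in the constructor
    }

private:
    AVPacket avpkt;
    AVFrame *frame;
    AVCodecContext *avctx;
    AVCodec *codec;
};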

Upvotes: 2

bytedreamer

Reputation: 142

I had a similar routine using FFmpeg that leaked memory. I resolved it by deallocating the frame and packet objects for each call to avcodec_decode_video2.

In your code the packet object is freed, but the frame is not. Adding the following lines before avcodec_decode_video2 should resolve the memory leak. I found that it is safe to call avcodec_free_frame on a frame object that has already been deallocated, so you could remove the allocation of the frame before the while loop.

avcodec_free_frame(&_n_frame);
_n_frame = avcodec_alloc_frame();
avcodec_decode_video2(_codec_context_video, _n_frame, &frameDone, packet);
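
Applied to the loop in your question, the change might look roughly like this (same variable names as in your code; only the two lines before the decode call are new, and the rest of the loop body stays as it is):

while (!frameDone && (packet = this->DEQUEUE(VIDEO)) != NULL)
{
    if (packet == (AVPacket*)-1) return -1;

    goodop = true;
    _s_pts = packet->pts;

    avcodec_free_frame(&_n_frame);    // release buffers the decoder attached on the previous iteration
    _n_frame = avcodec_alloc_frame(); // start this packet with a fresh frame

    avcodec_decode_video2(_codec_context_video, _n_frame, &frameDone, packet);
    av_free_packet(packet);

    // ... pts handling as in the question ...
}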

Upvotes: 1

Sarin Sukumar

Reputation: 83

I also had the same problem. According to ffplay.c, you should call

av_frame_unref(pFrame);
avcodec_get_frame_defaults(pFrame);

after every sws_scale call. This will free all the memory allocated during decoding.
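
In the code from the question, that would go right after the sws_scale call, before the frame is reused for the next packet. A sketch using the question's variable names (pFrame above corresponds to _n_frame here):

sws_scale(sws_ctx, _n_frame->data, _n_frame->linesize, 0,
          _codec_context_video->height, _rgb_frame->data, _rgb_frame->linesize);

av_frame_unref(_n_frame);             // drop the references the decoder attached to the frame
avcodec_get_frame_defaults(_n_frame); // reset the frame fields so it can be reused for the next decode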

Upvotes: 1
