Gilberto Ugolini

Reputation: 13

How to execute a ffmpeg code on a GPU without using the command line?

We have written a short program in C to read a video file, using common libraries such as libavcodec, libavformat, etc.

The code runs smoothly but only uses CPU resources. We need to run it on the GPU (Nvidia GeForce 940MX and 1080Ti). Is there a way to force the code to run on the GPU?

While everything works fine from the command line (e.g., ffmpeg -hwaccel cuvid -i vid.mp4 out.avi), we are not able to get it working on the GPU from the source code.

We are working with Ubuntu 18.04, and ffmpeg is correctly compiled with CUDA 9.2.

Upvotes: 1

Views: 2035

Answers (1)

the kamilz

Reputation: 1988

There are pretty good examples of using libav (FFmpeg) for encoding and decoding video at https://github.com/FFmpeg/FFmpeg/tree/master/doc/examples.

What you need is the demuxing_decoding.c example. Change line 166, which is:

/* find decoder for the stream */
dec = avcodec_find_decoder(st->codecpar->codec_id);

with

/* find decoder for the stream */
if (st->codecpar->codec_id == AV_CODEC_ID_H264)
{
    dec = avcodec_find_decoder_by_name("h264_cuvid");
}
else if (st->codecpar->codec_id == AV_CODEC_ID_HEVC)
{
    dec = avcodec_find_decoder_by_name("hevc_cuvid");
}
else
{
    dec = avcodec_find_decoder(st->codecpar->codec_id);
}

Add/change lines for other formats, and make sure your FFmpeg is compiled with --enable-cuda --enable-cuvid.
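The same idea can be wrapped in a small helper for other codecs. The following is only a minimal sketch (the helper name is made up): the *_cuvid names assume the corresponding NVDEC decoders were enabled when FFmpeg was configured, and avcodec_find_decoder_by_name() returns NULL when a decoder is missing, so the function falls back to the software decoder.

#include <libavcodec/avcodec.h>

/* Minimal sketch: pick an NVDEC (cuvid) decoder when one exists for the
 * stream's codec, otherwise fall back to the software decoder. */
static AVCodec *find_cuvid_or_sw_decoder(enum AVCodecID codec_id)
{
    const char *name = NULL;

    switch (codec_id) {
    case AV_CODEC_ID_H264:       name = "h264_cuvid";  break;
    case AV_CODEC_ID_HEVC:       name = "hevc_cuvid";  break;
    case AV_CODEC_ID_VP9:        name = "vp9_cuvid";   break;
    case AV_CODEC_ID_MPEG2VIDEO: name = "mpeg2_cuvid"; break;
    case AV_CODEC_ID_VC1:        name = "vc1_cuvid";   break;
    default:                     break;
    }

    AVCodec *dec = name ? avcodec_find_decoder_by_name(name) : NULL;
    if (!dec)
        dec = avcodec_find_decoder(codec_id); /* software fallback */
    return dec;
}

In demuxing_decoding.c the call at line 166 would then become dec = find_cuvid_or_sw_decoder(st->codecpar->codec_id);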

In my tests I got an error coming from line 85, because nvdec (hevc_cuvid) uses the P010 internal format for 10-bit input (the input was yuv420p10). This means the decoded frame will be in either the NV12 or the P010 pixel format, depending on the bit depth. I hope you are familiar with pixel formats.
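As a rough illustration (not part of the original example, and the function name is just a placeholder), this is one way you could check the decoded frame's pixel format and set up a libswscale conversion back to a planar format for the rest of your pipeline:

#include <stdio.h>
#include <libavutil/frame.h>
#include <libavutil/pixdesc.h>
#include <libswscale/swscale.h>

/* Sketch: report the pixel format of a frame coming out of the cuvid
 * decoder and, for NV12/P010, create a libswscale context to convert it
 * back to planar YUV. Destination allocation and the sws_scale() call
 * itself are left out for brevity. */
static void inspect_decoded_frame(const AVFrame *frame)
{
    printf("decoded pix_fmt: %s\n",
           av_get_pix_fmt_name(frame->format)); /* e.g. "nv12" or "p010le" */

    if (frame->format == AV_PIX_FMT_NV12 || frame->format == AV_PIX_FMT_P010LE) {
        /* for P010 (10-bit) you would normally target yuv420p10le instead
         * of yuv420p to avoid dropping precision */
        struct SwsContext *sws = sws_getContext(
            frame->width, frame->height, frame->format,
            frame->width, frame->height, AV_PIX_FMT_YUV420P,
            SWS_BILINEAR, NULL, NULL, NULL);
        if (sws) {
            /* ... allocate a destination AVFrame and call sws_scale() ... */
            sws_freeContext(sws);
        }
    }
}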

Hope that helps.

Upvotes: 1
