gamemn02

Reputation: 75

Encoding RGB frames using x264 and AVCodec in C

I have RGB24 frames streamed from a camera and I want to encode them into H.264. I found that AVCodec and x264 can do this. The problem is that x264 by default accepts YUV420 as input, so I wrote a program which converts the RGB frames to YUV420 using the sws_scale function. This works well, except that it does not reach the required FPS, because the conversion (RGB->YUV420) takes too long.
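For reference, the conversion step described above is normally set up with libswscale. A minimal sketch, assuming the frame size does not change (the function name `make_rgb_to_yuv_ctx` is mine, not an FFmpeg API); when source and destination sizes are equal, a cheap scaler flag such as SWS_POINT avoids filtering overhead, though how much that helps depends on the platform:

```c
#include <libswscale/swscale.h>
#include <libavutil/pixfmt.h>

struct SwsContext *make_rgb_to_yuv_ctx(int w, int h)
{
    /* SWS_POINT is the cheapest scaler (nearest neighbour);
       SWS_FAST_BILINEAR is the usual fast default. No scaling
       happens here, only the pixel format changes. */
    return sws_getContext(w, h, AV_PIX_FMT_RGB24,
                          w, h, AV_PIX_FMT_YUV420P,
                          SWS_POINT, NULL, NULL, NULL);
}
```

The context is created once and reused; sws_scale is then called per frame with the packed RGB buffer as the single source plane and the YUV frame's data/linesize arrays as the destination.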

This is how I set up my encoder context:

videoStream->id = 0;
vCodecCtx = videoStream->codec;

vCodecCtx->codec_type       = AVMEDIA_TYPE_VIDEO;
vCodecCtx->codec_id         = AV_CODEC_ID_H264;
vCodecCtx->bit_rate         = 400000;
vCodecCtx->width            = Width;
vCodecCtx->height           = Height;
vCodecCtx->time_base.den    = FPS;
vCodecCtx->time_base.num    = 1;
//vCodecCtx->time_base      = (AVRational){1,};
vCodecCtx->gop_size         = 12;
vCodecCtx->max_b_frames     = 1;
vCodecCtx->pix_fmt          = AV_PIX_FMT_YUV420P;

if(formatCtx->oformat->flags & AVFMT_GLOBALHEADER)
    vCodecCtx->flags |= CODEC_FLAG_GLOBAL_HEADER;

av_opt_set(vCodecCtx->priv_data, "preset", "ultrafast", 0);
av_opt_set(vCodecCtx->priv_data, "profile", "baseline", AV_OPT_SEARCH_CHILDREN);

if (avcodec_open2(vCodecCtx, h264Codec, NULL) < 0){
    return 0;
}

When I change AV_PIX_FMT_YUV420P to AV_PIX_FMT_RGB24, avcodec_open2 fails. I read that there is a version of libx264 for RGB called libx264rgb, but I don't know whether I have to rebuild x264 with this option enabled, download a different source, or whether I can do it programmatically with the x264 library I already have.

The question is: how do I enable RGB as input to libx264 when used with libavcodec in C, or how do I make the encoding or sws_scale faster?

Edit:

How I built FFmpeg:

NDK=D:/AndroidDev/android-ndk-r9
PLATFORM=$NDK/platforms/android-18/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/windows-x86_64

GENERAL="\
--enable-small \
--enable-cross-compile \
--extra-libs="-lgcc" \
--arch=arm \
--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
--cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
--extra-cflags="-I../x264/android/arm/include" \
--extra-ldflags="-L../x264/android/arm/lib" "


MODULES="\
--enable-gpl \
--enable-libx264"

function build_ARMv6
{
  ./configure \
  --target-os=linux \
  --prefix=./android/armeabi \
  ${GENERAL} \
  --sysroot=$PLATFORM \
  --enable-shared \
  --disable-static \
  --extra-cflags=" -O3 -fpic -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 -mfloat-abi=softfp -mfpu=vfp -marm -march=armv6" \
  --extra-ldflags="-lx264 -Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
  --enable-zlib \
  ${MODULES} \
  --disable-doc \
  --enable-neon

  make clean
  make
  make install
}

build_ARMv6

echo Android ARMEABI builds finished

How I built x264:

NDK=D:/AndroidDev/android-ndk-r9
PLATFORM=$NDK/platforms/android-18/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/windows-x86_64
PREFIX=./android/arm

function build_one
{
  ./configure \
  --prefix=$PREFIX \
  --enable-static \
  --enable-pic \
  --host=arm-linux \
  --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
  --sysroot=$PLATFORM

  make clean
  make
  make install
}

build_one

echo Android ARM builds finished

Upvotes: 5

Views: 5837

Answers (2)

Dalen

Reputation: 4236

Write your own RGB2YUV converter.

Get pixel map from the frame and run it through your function. No scaling, no nothing, just one for loop.

There are easy formulae to convert RGB888 to YCbCr (YUV4:4:4).

But AV/FFMpeg should be able to do it for you easily.

For YUV420 you'll need the whole 4:4:4 Y channel, and to combine every 2x2 block of U and V pixels into one, using either a mean or a Gaussian filter, to get 4:2:0.

Like this:

This code expects one ABC4:4:4 Cb or Cr channel and returns its ABC4:2:0 version.

#define uint8 unsigned char

/* Downsample one 4:4:4 chroma plane (Cb or Cr) to 4:2:0 by
   averaging each 2x2 block. dst must hold (w/2)*(h/2) bytes. */
uint8 *ABC_444_to_420(uint8 *src, uint8 *dst, int w, int h)
{
    int dpos = 0;
    for (int row = 0; row < h; row += 2) {
        for (int col = 0; col < w; col += 2) {
            int p1 = row * w + col;       /* top-left of the 2x2 block */
            int p2 = (row + 1) * w + col; /* bottom-left               */
            dst[dpos++] = (src[p1] + src[p1 + 1] + src[p2] + src[p2 + 1]) >> 2;
        }
    }
    return dst;
}

So, you get YUV444 from RGB, then run the U and V planes through the code above separately.

If you cannot find an appropriate function in AV/FFMpeg that converts RGB to YCbCr, take a look at:

https://msdn.microsoft.com/en-us/library/windows/desktop/dd206750%28v=vs.85%29.aspx
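The integer approximation from that page can be written directly in C. A minimal sketch (the function names are mine, for illustration); the per-pixel helper uses the BT.601 studio-swing constants given in the link:

```c
#include <stdint.h>

/* RGB888 -> YCbCr (BT.601, studio swing), integer approximation
   from the MSDN page linked above. Converts one pixel. */
static void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                         uint8_t *y, uint8_t *cb, uint8_t *cr)
{
    *y  = (uint8_t)((( 66 * r + 129 * g +  25 * b + 128) >> 8) +  16);
    *cb = (uint8_t)(((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128);
    *cr = (uint8_t)(((112 * r -  94 * g -  18 * b + 128) >> 8) + 128);
}

/* Convert a packed RGB24 frame into three planar YUV4:4:4 buffers,
   each of w*h bytes. */
static void rgb24_to_yuv444(const uint8_t *rgb, uint8_t *y_plane,
                            uint8_t *u_plane, uint8_t *v_plane,
                            int w, int h)
{
    for (int i = 0; i < w * h; i++)
        rgb_to_ycbcr(rgb[3 * i], rgb[3 * i + 1], rgb[3 * i + 2],
                     &y_plane[i], &u_plane[i], &v_plane[i]);
}
```

With these constants, white (255,255,255) maps to Y=235, Cb=Cr=128 and black maps to Y=16, Cb=Cr=128, i.e. the studio-swing range.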

You may, of course, fold the whole process into a single pass at the RGB to YCbCr conversion stage, to get ABC4:2:0 out directly.

Do not work directly on AV/FFMpeg frame pixels; get yourself the raw data in a plain array. That way you'll get maximum speed, and you can construct the AV/FFMpeg frame back from the results afterwards.

Upvotes: 0

nobody555

Reputation: 2374

To use RGB pixel formats (AV_PIX_FMT_BGR0, AV_PIX_FMT_BGR24, AV_PIX_FMT_RGB24) with libx264 in libavcodec you need to:

  1. use libavcodec from the ffmpeg project and not from the libav project, because currently it is only available there;
  2. make sure that libavcodec was compiled with libx264rgb support (CONFIG_LIBX264RGB_ENCODER), which as I understand is enabled if you use a new enough 8-bit libx264 (configured with --enable-libx264);
  3. use avcodec_find_encoder_by_name("libx264rgb") instead of avcodec_find_encoder(AV_CODEC_ID_H264).
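A minimal sketch of step 3 (assuming an FFmpeg build where the libx264rgb encoder was enabled; the rest of the context setup stays as in the question, except for pix_fmt):

```c
#include <libavcodec/avcodec.h>
#include <stdio.h>

AVCodec *find_rgb_h264_encoder(void)
{
    /* The RGB variant is a separate encoder name,
       not a different AV_CODEC_ID. */
    AVCodec *codec = avcodec_find_encoder_by_name("libx264rgb");
    if (!codec) {
        fprintf(stderr, "libavcodec was built without libx264rgb\n");
        return NULL;
    }
    /* The context would then use an RGB pixel format, e.g.
       vCodecCtx->pix_fmt = AV_PIX_FMT_RGB24; */
    return codec;
}
```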

Upvotes: 2
