Reputation: 11076
I have a raw uint8 pointer to a buffer containing I420 formatted video data, and a buffer size. I also know the frame width / height. I want to feed the data into a library which can create video frames via this function signature:
Copy(int width, int height,
const uint8_t* data_y, int stride_y,
const uint8_t* data_u, int stride_u,
const uint8_t* data_v, int stride_v)
Is there some simple pointer arithmetic to resolve this?
Upvotes: 2
Views: 983
Reputation: 207758
The best site I know for describing the various YUV and RGB video formats is FOURCC - the YUV descriptions are here.
The I420 format you refer to is described here. That means your raw data is planar and, if tightly packed, laid out like this: a full-resolution Y plane of width*height bytes, followed by a quarter-resolution U plane of (width/2)*(height/2) bytes, followed by a V plane of the same size as the U plane.
I find ffmpeg is the best tool for generating YUV data to encode into, and decode out of, your own software. Here are some tips for working with ffmpeg and raw YUV.
You can get a list of the pixel formats it supports with:
ffmpeg -pix_fmts
So, to find yours, I looked for something with 420 in it:
ffmpeg -pix_fmts | grep 420p
IO... yuv420p 3 12 8-8-8
so I know I need -pix_fmt yuv420p to encode or decode your data. I can also get a decent description of how that is laid out by checking the ffmpeg source here. The 12 above means 12 bits per pixel.
Then I wanted to generate a sample frame with ffmpeg, split out the planes with dd, and extract the Y, U and V channels with ImageMagick, so I made the following bash script:
#!/bin/bash
################################################################################
# User-editable values
################################################################################
# Define WIDTH and HEIGHT of the frame we want to generate...
# ... so we are consistent all the way through
W=640
H=480
PIX_FMT="yuv420p"
################################################################################
# Derived values - do not edit
################################################################################
BASENAME="${PIX_FMT}-${W}x${H}"
FILENAME="${BASENAME}.raw"
PNGNAME="${BASENAME}.png"
UVW=$((W/2)) # width of U plane, same as V plane
UVH=$((H/2)) # height of U plane, same as V plane
YBYTES=$((H*W)) # bytes in Y plane
UBYTES=$((UVW*UVH)) # bytes in U plane, same as in V plane
# Generate a sample frame
echo "Generating sample: ${FILENAME}, and viewable PNG equivalent: ${PNGNAME}"
ffmpeg -y -f lavfi -i testsrc=size=${W}x${H}:rate=1:duration=1 -vcodec rawvideo -pix_fmt "$PIX_FMT" -f image2pipe - > "$FILENAME"
ffmpeg -y -f lavfi -i testsrc=size=${W}x${H}:rate=1:duration=1 "$PNGNAME"
# Check its size in bytes
ls -l "$FILENAME"
# Extract Y plane from sample into "Y.png" using ImageMagick
echo "Extracting Y plane into Y.png"
dd if="$FILENAME" bs=$YBYTES count=1 | magick -depth 8 -size ${W}x${H} gray:- Y.png
# Extract U plane from sample into "U.png" using ImageMagick
echo "Extracting U plane into U.png"
dd if="$FILENAME" bs=1 skip=$YBYTES count=$UBYTES | magick -depth 8 -size ${UVW}x${UVH} gray:- U.png
# Extract V plane from sample into "V.png" using ImageMagick
echo "Extracting V plane into V.png"
dd if="$FILENAME" bs=1 skip=$((YBYTES+UBYTES)) count=$UBYTES | magick -depth 8 -size ${UVW}x${UVH} gray:- V.png
# Recombine with ImageMagick
echo "Combining Y.png, U.png, V.png into result.png"
magick Y.png \( U.png V.png -resize 200% \) -set colorspace YUV -combine result.png
# Create a PNG from the YUV420p raw data just the same with 'ffmpeg'
echo "Create PNG from the YUV420p raw data as 'extracted.png'"
ffmpeg -y -f rawvideo -video_size ${W}x${H} -pixel_format "$PIX_FMT" -i - extracted.png < "$FILENAME"
That creates this image as PNG and as I420 data for you to test with:
and these Y, U and V planes:
Upvotes: 2