Trykowo

Reputation: 55

Passing bytes to ffmpeg in python with io

Sorry, I'm new to Stack Overflow.

Is it possible to pass byte data to ffmpeg via an io object?
I'm trying to extract frames from a GIF with ffmpeg, then use Pillow to resize them.
I know you can extract frames from a GIF with Pillow, but it sometimes butchers certain GIFs, so I'm using ffmpeg as a workaround.
The reason I'd like the GIF to be read from memory is that I'm going to change this so GIFs fetched from URLs are wrapped in BytesIO instead of being saved to disk.
As for the extra Pillow code: I did successfully get this working by passing an actual filename to the ffmpeg command.

import subprocess as SP
from PIL import Image

original_pil = Image.open("1.gif")

bytes_io = open("1.gif", "rb")
bytes_io.seek(0)

ffmpeg = 'ffmpeg'

cmd = [ffmpeg,
       '-i', '-',
       '-vsync', '0',
       '-f', 'image2pipe',
       '-pix_fmt', 'rgba',
       '-vcodec', 'png',
       '-report',
       '-']

depth = 4
width, height = original_pil.size
buf_size = depth * width * height + 100
nbytes = width * height * 4

proc = SP.Popen(cmd, stdout=SP.PIPE, stdin=SP.PIPE, stderr=SP.PIPE, bufsize=buf_size, shell=False)
out, err = proc.communicate(input=bytes_io.read(), timeout=None)
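For context, the in-memory wrapping I plan to switch to would look roughly like this (the URL is hypothetical; the stand-in bytes just demonstrate the wrapping):

```python
import io
import urllib.request

# Hypothetical URL -- in the real code this would be a GIF link:
# gif_bytes = urllib.request.urlopen("https://example.com/some.gif").read()
gif_bytes = b"GIF89a..."  # stand-in for the downloaded data

bytes_io = io.BytesIO(gif_bytes)
bytes_io.seek(0)
# bytes_io.read() can then be passed as input= to proc.communicate().
```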

FFMPEG report:

ffmpeg started on 2021-06-07 at 18:58:14
Report written to "ffmpeg-20210607-185814.log"
Command line:
ffmpeg -i - -vsync 0 -f image2pipe -pix_fmt rgba -vcodec png -report -
ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
  configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --ena
WARNING: library configuration mismatch
  avcodec     configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enab
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
Splitting the commandline.
Reading option '-i' ... matched as input url with argument '-'.
Reading option '-vsync' ... matched as option 'vsync' (video sync method) with argument '0'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'image2pipe'.
Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'rgba'.
Reading option '-vcodec' ... matched as option 'vcodec' (force video codec ('copy' to copy stream)) with argument 'png'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option vsync (video sync method) with argument 0.
Applying option report (generate a report) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url -.
Successfully parsed a group of options.
Opening an input file: -.
[NULL @ 0x55b59c38f7c0] Opening 'pipe:' for reading
[pipe @ 0x55b59c390240] Setting default whitelist 'crypto'
[gif @ 0x55b59c38f7c0] Format gif probed with size=2048 and score=100
[AVIOContext @ 0x55b59c398680] Statistics: 4614093 bytes read, 0 seeks
pipe:: Input/output error

Upvotes: 3

Views: 7992

Answers (2)

jns

Reputation: 1382

PyAV can handle bytes, see av.open. This should be more efficient than opening a subprocess. It's straightforward:

video_bytes = ... # from file/server/whatever
bio = io.BytesIO(video_bytes)
av_container = av.open(bio, mode="r")

The library also offers filtergraph building and other features; check the docs to see whether it covers everything you need.

Upvotes: 1

Rotem

Reputation: 32124

For a single image, your code works fine.
It looks like you are just missing proc.wait() at the end.

For multiple images, you may take a look at my post here.
You can simplify that code for working with images.

I made a few changes to your code to make it (I think) more elegant:

  • You don't need '-vsync', '0' argument.
  • I replaced '-' with 'pipe:' (I think it's more clear).
  • You don't need to set bufsize unless you know that the default is too small.
  • I removed stderr=SP.PIPE, because I wanted to see the FFmpeg log in the console.
  • I added proc.wait() after proc.communicate.

The code sample starts by building a synthetic GIF file for testing.


Here is the code sample:

import subprocess as sp
import shlex
from PIL import Image
from io import BytesIO

# Build synthetic image tmp.gif for testing
sp.run(shlex.split('ffmpeg -y -f lavfi -i testsrc=size=128x128:rate=1:duration=1 tmp.gif'))

original_pil = Image.open('tmp.gif')

bytes_io = open('tmp.gif', "rb")
bytes_io.seek(0)

ffmpeg = 'ffmpeg'

cmd = [ffmpeg,
       '-i', 'pipe:',
       #'-vsync', '0',
       '-f', 'image2pipe',
       '-pix_fmt', 'rgba',
       '-vcodec', 'png',
       '-report',
       'pipe:']

proc = sp.Popen(cmd, stdout=sp.PIPE, stdin=sp.PIPE)
out = proc.communicate(input=bytes_io.read())[0]

proc.wait()

bytes_io_png = BytesIO(out)
img = Image.open(bytes_io_png)
img.show()

Output:


Passing multiple images:

In case there are multiple images, you can use proc.communicate only if all the images fit in RAM.
Instead of loading all the images into RAM and passing them to FFmpeg at once, it's better to use a writer thread and a for loop.

I tried to pass PNG images, but it got too messy.
I changed the code to pass images in raw format instead.
The advantage of raw images is that the size in bytes of every frame is known in advance.
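As a quick check, for the 128x128 RGBA frames used in these samples the per-frame byte count is:

```python
width, height, depth = 128, 128, 4  # RGBA: 4 bytes per pixel
nbytes = width * height * depth
print(nbytes)  # 65536
```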

Here is a code sample (not using BytesIO):

import numpy as np
import subprocess as sp
import shlex
from PIL import Image
import threading


# Write GIF images to the stdin pipe.
def writer(stdin_pipe):
    # Write 30 images to the stdin pipe (for example)
    for i in range(1, 31):
        in_file_name = 'tmp' + str(i).zfill(2) + '.gif'

        with open(in_file_name, "rb") as f:
            stdin_pipe.write(f.read())  # Write bytes to stdin pipe

    stdin_pipe.close()


# Build 30 synthetic images tmp01.gif, tmp02.gif, ..., tmp30.gif for testing
sp.run(shlex.split('ffmpeg -y -f lavfi -i testsrc=size=128x128:rate=1:duration=30 -f image2 tmp%02d.gif'))


original_pil = Image.open("tmp01.gif")
depth = 4
width, height = original_pil.size
nbytes = width * height * depth  # Bytes per RGBA frame


ffmpeg = 'ffmpeg'

cmd = [ffmpeg,
       '-i', 'pipe:',
       '-f', 'image2pipe',
       '-pix_fmt', 'rgba',
       '-vcodec', 'rawvideo',  # Select rawvideo codec
       '-report',
       'pipe:']


proc = sp.Popen(cmd, stdout=sp.PIPE, stdin=sp.PIPE)

thread = threading.Thread(target=writer, args=(proc.stdin,))
thread.start()  # Start writer thread


while True:
    in_bytes = proc.stdout.read(nbytes)  # Read raw image bytes from stdout pipe.

    # Break the loop when the number of bytes read is less than the expected frame size.
    if len(in_bytes) < nbytes:
        break

    raw_image = np.frombuffer(in_bytes, np.uint8).reshape([height, width, 4])

    img = Image.fromarray(raw_image)
    img.show()

proc.wait()
thread.join()

Upvotes: 3
