The42ndTurtle

Reputation: 81

Processing Frames of a Screen Recording in Node Using FFmpeg

I am trying to capture my screen using Node.js and FFmpeg, and I have gotten as far as saving an FLV file, but I am unable to process the frames in real time.

My code so far is

const ffmpeg = require('ffmpeg-static');
const { spawn } = require('child_process');
const { createWriteStream } = require('fs');

const canvas = document.querySelector('#canvas');
const ctx = canvas.getContext('2d');

// Capture the desktop with gdigrab and write an FLV stream to stdout.
const ffmpegProcess = spawn(
  ffmpeg,
  ['-f', 'gdigrab', '-framerate', '30', '-i', 'desktop', '-crf', '0', '-preset', 'ultrafast', '-f', 'flv', '-'],
  { stdio: 'pipe' }
);

const stream = ffmpegProcess.stdout;

// Save the raw stream to disk.
const file = createWriteStream('capture.flv');
stream.pipe(file);

// Try to treat each chunk as a PNG frame and draw it to the canvas.
stream.on('data', chunk => {
  const base64 = chunk.toString('base64');
  const data = `data:image/png;base64,${base64}`;

  const image = new Image();
  image.src = data;
  ctx.drawImage(image, 0, 0);
});

The capture.flv file written to disk is fine, but the chunks coming off the stream do not appear to be valid images. When I log the base64 string for a single chunk and try to turn just that string into an image, it is invalid image data. I want to use the stream to process each frame of the capture as it arrives.

Upvotes: 0

Views: 1341

Answers (1)

The42ndTurtle

Reputation: 81

I found the solution. Changing the -f output format from flv to mjpeg did the trick. This streams the capture as a series of JPEG images, so each chunk can be converted to a base64 string and drawn as a JPEG image.
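
For reference, a minimal sketch of what that change looks like, based on the code in the question. The -q:v setting (JPEG quality, lower is better) and the onload handler are my additions, not something the original code used, and this still assumes the same renderer context where document and Image are available.

const ffmpeg = require('ffmpeg-static');
const { spawn } = require('child_process');

const canvas = document.querySelector('#canvas');
const ctx = canvas.getContext('2d');

// Output raw MJPEG to stdout instead of FLV.
const ffmpegProcess = spawn(
  ffmpeg,
  ['-f', 'gdigrab', '-framerate', '30', '-i', 'desktop', '-q:v', '2', '-f', 'mjpeg', '-'],
  { stdio: 'pipe' }
);

ffmpegProcess.stdout.on('data', chunk => {
  // Treat the chunk as one JPEG frame, decode it, then draw it to the canvas.
  const image = new Image();
  image.onload = () => ctx.drawImage(image, 0, 0);
  image.src = `data:image/jpeg;base64,${chunk.toString('base64')}`;
});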

Upvotes: 1
