
Using Electron to create videos (Canvas + FFmpeg)

Published at
2/14/2024
Categories
electron
javascript
ffmpeg
web
Author
yonatanbd
At Gling I faced the challenge of rendering captions with a transparent background from the browser Canvas into a video file. I already had the code that paints the captions onto the Canvas per keyframe (for playback purposes), but we needed to render and encode those frames into a file the user could use.

Using WebCodecs (no transparent background support)

Initially, I tried the new WebCodecs API: pass each Canvas image to a VideoFrame, encode the frames with a VideoEncoder using the VP9 codec (which supports transparency), and finally mux the video stream into a webm file using webm-muxer.
That worked great (and was relatively fast), but then I discovered that even though the VP9 codec supports an alpha channel (for the transparent background), it is not implemented in the WebCodecs API (yet?).

So I had to use FFmpeg to encode the raw frames with a codec and container format that support an alpha channel.
But how do you transfer Canvas bitmap data to FFmpeg for encoding and muxing?

Part 1 - Creating the frame image data

Assuming I need to create a 60-second video at 30 frames per second, I need to create 1,800 frames. For each frame, I paint the right captions onto the canvas and convert it to raw image data.
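Before writing any code, it helps to see how much data this produces. A quick sketch of the arithmetic (plain numbers, not from the article; getImageData yields RGBA, i.e. 4 bytes per pixel):

```typescript
// Each frame from getImageData is RGBA: 4 bytes per pixel.
const width = 1920;
const height = 1080;
const framerate = 30;
const seconds = 60;

const frames = seconds * framerate;        // 1,800 frames
const bytesPerFrame = width * height * 4;  // 8,294,400 bytes (~7.9 MiB)
const totalBytes = frames * bytesPerFrame; // ~13.9 GiB of raw pixels

console.log(frames, bytesPerFrame, totalBytes);
```

This is why the frames are streamed to FFmpeg one at a time rather than buffered in memory.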

// renderer.ts

async function createFramesData(
  seconds: number,
  framerate: number,
  dimensions: { width: number, height: number },
  onFrame: (data: Uint8ClampedArray) => Promise<unknown>
) {
  const frames = seconds * framerate;

  const canvas = new OffscreenCanvas(dimensions.width, dimensions.height);
  const ctx = canvas.getContext('2d', {
    alpha: true,
    willReadFrequently: true,
  }) as OffscreenCanvasRenderingContext2D;

  for (let i = 1; i <= frames; i++) {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    // The function that paints the captions for the given frame number
    paintCaptionFrame(ctx, i);
    const rawFrameImage = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
    await onFrame(rawFrameImage);
  }
}

Part 2 - Sending raw frames to FFmpeg

Now for the interesting part. We need to listen to the onFrame callback that delivers the raw image data and pipe it to an FFmpeg process. FFmpeg does the magic: it ingests the raw frames, encodes them with a codec of our choice (one that supports an alpha channel), and muxes the result into a file.

// renderer.ts

const duration = 60;
const framerate = 30;
const dimensions = {
  width: 1920,
  height: 1080,
}
const processId = await startFFmpeg(framerate, dimensions);
await createFramesData( // The function from above
  duration, 
  framerate, 
  dimensions, 
  (data) => {
    return pipeFramesToFFmpegProcess(processId, data);
  }
);
await closeFFmpegPipe(processId);

To start the FFmpeg process, we make an IPC call to the Electron Node.js backend, which spawns the FFmpeg process.

To understand Electron's IPC mechanism, see the official Electron guide: https://www.electronjs.org/docs/latest/tutorial/ipc

In short, we need a listener in the Node.js backend that handles IPC calls: it executes FFmpeg, pipes data to the process, and closes the process's standard input.
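One detail worth knowing before the frame data crosses the process boundary: Electron serializes IPC arguments with the structured clone algorithm, and a typed-array view may cover only part of its underlying ArrayBuffer. If you want to ship exactly the viewed bytes as a standalone ArrayBuffer, you have to slice them out. A minimal sketch (toTransferable is a hypothetical helper name, not from the article):

```typescript
// Hypothetical helper: copy exactly the bytes a typed-array view covers
// into a standalone ArrayBuffer, safe to send over IPC via structured clone.
function toTransferable(view: Uint8ClampedArray) {
  return view.buffer.slice(view.byteOffset, view.byteOffset + view.byteLength);
}

// Example: a view into the middle of a larger buffer
const backing = new ArrayBuffer(16);
const view = new Uint8ClampedArray(backing, 4, 8);
const payload = toTransferable(view);
console.log(payload.byteLength); // 8: only the viewed bytes are copied
```

For buffers fresh out of getImageData this slicing is a no-op copy (the view spans the whole buffer), but it keeps the IPC payload honest if you ever reuse a larger scratch buffer.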

In main.ts, the Node.js Electron backend:

// main.ts

import { ipcMain } from 'electron';
import { ChildProcess, execFile } from 'child_process';

const childProcesses = new Map<number, ChildProcess>();

// Use ipcMain.handle (not .on) so the return values reach the renderer
ipcMain.handle('exec', (_, executable: string, args: unknown[]) => {
  const child = execFile(
    executable,
    args.map((arg) => String(arg)),
    { maxBuffer: 100 * 1024 * 1024 },
  );
  childProcesses.set(child.pid!, child);
  return child.pid;
});

ipcMain.handle('pipeIn', (_, pid: number, data: Uint8Array) => {
  return new Promise<void>((res, rej) => {
    const child = childProcesses.get(pid);
    if (!child?.stdin) {
      rej(new Error('No child process found with pid ' + pid));
      return;
    }
    child.stdin.write(Buffer.from(data), (error) => {
      if (error) {
        rej(error);
      } else {
        res();
      }
    });
  });
});

ipcMain.handle('endPipeIn', (_, pid: number) => {
  return new Promise<void>((res, rej) => {
    const child = childProcesses.get(pid);
    if (!child?.stdin) {
      rej(new Error('No child process found with pid ' + pid));
      return;
    }
    child.stdin.end(() => {
      childProcesses.delete(pid);
      res();
    });
  });
});

Then expose the IPC calls to the renderer using a preload file:

// preload.ts

import { contextBridge, ipcRenderer } from 'electron/renderer';

contextBridge.exposeInMainWorld('api', {
  // ipcRenderer.invoke pairs with ipcMain.handle and returns a Promise
  exec: (executable: string, args: unknown[]) => ipcRenderer.invoke('exec', executable, args),
  pipeIn: (pid: number, data: Uint8ClampedArray) => ipcRenderer.invoke('pipeIn', pid, data),
  endPipeIn: (pid: number) => ipcRenderer.invoke('endPipeIn', pid),
});

Part 3 - Connecting everything

Now we can execute FFmpeg and send it the raw frames for encoding and muxing.

//renderer.ts

const duration = 60;
const framerate = 30;
const dimensions = {
  width: 1920,
  height: 1080,
}
const processId = await startFFmpeg(framerate, dimensions);
await createFramesData( // The function from above
  duration, 
  framerate, 
  dimensions, 
  (data) => {
    return pipeFramesToFFmpegProcess(processId, data);
  }
);
await closeFFmpegPipe(processId);

async function createFramesData(
  seconds: number,
  framerate: number,
  dimensions: { width: number, height: number },
  onFrame: (data: Uint8ClampedArray) => Promise<unknown>
) {
  const frames = seconds * framerate;

  const canvas = new OffscreenCanvas(dimensions.width, dimensions.height);
  const ctx = canvas.getContext('2d', {
    alpha: true,
    willReadFrequently: true,
  }) as OffscreenCanvasRenderingContext2D;

  for (let i = 1; i <= frames; i++) {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    // The function that paints the captions for the given frame number
    paintCaptionFrame(ctx, i);
    const rawFrameImage = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
    await onFrame(rawFrameImage);
  }
}

function startFFmpeg(
  framerate: number,
  dimensions: { width: number, height: number },
) {
  return window.api.exec('ffmpeg', [
    '-f',
    'rawvideo',
    '-pix_fmt',
    'rgba',
    '-s',
    `${dimensions.width}x${dimensions.height}`,
    '-r',
    framerate,
    '-i',
    '-',
    '-c:v',
    'libvpx-vp9',
    '-pix_fmt',
    'yuva420p',
    'out.webm',
  ]);
}

function pipeFramesToFFmpegProcess(pid: number, data: Uint8ClampedArray) {
  return window.api.pipeIn(pid, data);
}

function closeFFmpegPipe(pid: number) {
  return window.api.endPipeIn(pid);
}

And that's it: at the end you will have a webm file containing the frames painted on the canvas.

  • Some things are missing here (like waiting for the FFmpeg process to complete); I removed them to keep the example as simple as possible.

  • You don't have to use the VP9 codec; there are other codecs with an alpha channel (but not many).
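For example, FFmpeg's prores_ks encoder supports alpha through the yuva444p10le pixel format, and qtrle (QuickTime Animation) supports argb. A small sketch of the output-side arguments for each option (buildEncodeArgs is my own helper name, not from the article; the vp9 line mirrors the startFFmpeg call above):

```typescript
type AlphaCodec = 'vp9' | 'prores' | 'qtrle';

// Hypothetical helper: output-side FFmpeg arguments for codecs that keep alpha.
function buildEncodeArgs(codec: AlphaCodec, outFile: string): string[] {
  switch (codec) {
    case 'vp9':    // webm, same as the article's startFFmpeg
      return ['-c:v', 'libvpx-vp9', '-pix_fmt', 'yuva420p', outFile];
    case 'prores': // ProRes 4444 in a .mov, 10-bit with alpha
      return ['-c:v', 'prores_ks', '-profile:v', '4444', '-pix_fmt', 'yuva444p10le', outFile];
    case 'qtrle':  // QuickTime Animation, lossless RLE with alpha
      return ['-c:v', 'qtrle', '-pix_fmt', 'argb', outFile];
  }
}

console.log(buildEncodeArgs('prores', 'out.mov'));
```

Whatever codec you pick, the input side of the command (rawvideo, rgba, size, framerate, stdin) stays the same; only the output arguments change.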

We hire at Gling

I'm always looking for talented developers to join our small but excellent team. If you want to join us and build a product that users love, with a focus on video editing on the web, feel free to email me at [email protected]. We are fully remote and hire globally! Visit our website: https://gling.ai
