Web Workers
Although the WebCodecs API already decodes on its own thread, and the renderers are fast, you might still want to run them in a dedicated Web Worker so that other tasks on the main thread won't affect rendering responsiveness.
Only WebGLVideoFrameRenderer and BitmapVideoFrameRenderer are supported in Web Workers. There are two ways to render the frames:
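Before choosing either setup, you may want to verify that the environment supports rendering in a worker at all. The helper below is a hypothetical sketch (supportsWorkerRendering is not part of the library); it only uses standard feature detection, and takes the global object as a parameter so it is easy to test:

```javascript
// Hypothetical helper (not part of the library): check whether the current
// environment has the APIs needed for decoding and rendering in a worker.
function supportsWorkerRendering(globals = globalThis) {
  // OffscreenCanvas is required for rendering off the main thread;
  // VideoDecoder is the core WebCodecs decoding API.
  return (
    typeof globals.OffscreenCanvas === "function" &&
    typeof globals.VideoDecoder === "function"
  );
}
```

Call it on the main thread before spawning the worker, and fall back to a main-thread decoder when it returns false.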
Method A: Create OffscreenCanvas directly
When their constructors are called without arguments, they create an OffscreenCanvas object internally.
This OffscreenCanvas is not bound to a <canvas> element. To display the frames, call transferToImageBitmap on the worker side, transfer the resulting ImageBitmap to the main thread, and draw it onto a regular canvas:
// worker.js - Renderer worker
import {
  WebGLVideoFrameRenderer,
  WebCodecsVideoDecoder,
} from "@yume-chan/scrcpy-decoder-webcodecs";

// Called without arguments, the renderer creates an OffscreenCanvas internally
const renderer = new WebGLVideoFrameRenderer();

self.addEventListener("message", async (e) => {
  const { codec, videoStream } = e.data;

  // Set up decoder with the renderer
  const decoder = new WebCodecsVideoDecoder({ codec, renderer });

  // Report size changes to the main thread
  decoder.sizeChanged(({ width, height }) => {
    postMessage({ type: "size", width, height });
  });

  // Process video stream
  await videoStream.pipeTo(decoder.writable).catch((error) => {
    console.error("Decode error:", error);
  });
});

// Periodically transfer rendered frames to the main thread
function transferFrames() {
  // Snapshot the current frame as an ImageBitmap
  const imageBitmap = renderer.canvas.transferToImageBitmap();
  // Transfer (not copy) the bitmap to the main thread
  postMessage({ type: "frame", imageBitmap }, [imageBitmap]);
  // Continue transferring frames
  requestAnimationFrame(transferFrames);
}

// Start the frame transfer loop
transferFrames();
// main-thread.js - Display frames from the worker
import { ScrcpyVideoCodecId } from "@yume-chan/scrcpy";

const canvas = document.getElementById("displayCanvas");
const ctx = canvas.getContext("2d");

// Set up worker
const worker = new Worker("worker.js");

worker.addEventListener("message", (e) => {
  if (e.data.type === "size") {
    // Resize the canvas to match the video
    canvas.width = e.data.width;
    canvas.height = e.data.height;
  } else if (e.data.type === "frame") {
    // Draw the received ImageBitmap onto the canvas
    ctx.drawImage(e.data.imageBitmap, 0, 0);
    // Release the ImageBitmap after drawing
    e.data.imageBitmap.close();
  }
});

// Send the video stream to the worker.
// A ReadableStream must be listed in the transfer list to cross threads.
worker.postMessage(
  {
    codec: ScrcpyVideoCodecId.H264,
    videoStream: videoPacketStream,
  },
  [videoPacketStream],
);
This approach renders video frames in the worker and displays them efficiently on the main thread, since transferring an ImageBitmap moves it between threads without copying the pixel data.
Method B: Create OffscreenCanvas from a <canvas> element
An OffscreenCanvas object can also be created by calling the transferControlToOffscreen() method on an existing <canvas> element.
It can then be transferred to the worker via postMessage and passed into the renderer constructor. When the renderer draws on the OffscreenCanvas, the content is displayed on the source <canvas> element automatically.
<!-- index.html -->
<canvas id="canvas"></canvas>
JavaScript:
// index.js
import { ScrcpyVideoCodecId } from "@yume-chan/scrcpy";

const canvas = document.getElementById("canvas");
const offscreenCanvas = canvas.transferControlToOffscreen();

const worker = new Worker("worker.js");
worker.postMessage(
  {
    codec: ScrcpyVideoCodecId.H264,
    canvas: offscreenCanvas,
    stream: videoPacketStream,
  },
  [offscreenCanvas, videoPacketStream],
);

worker.addEventListener("message", (e) => {
  const { width, height } = e.data;
  // After transferControlToOffscreen, the element's intrinsic size is
  // controlled by the worker; only its CSS size can be changed here.
  canvas.style.width = `${width}px`;
  canvas.style.height = `${height}px`;
});
TypeScript:
// index.ts
import type { ScrcpyMediaStreamPacket } from "@yume-chan/scrcpy";
import { ScrcpyVideoCodecId } from "@yume-chan/scrcpy";

declare const videoPacketStream: ReadableStream<ScrcpyMediaStreamPacket>;

const canvas = document.getElementById("canvas") as HTMLCanvasElement;
const offscreenCanvas = canvas.transferControlToOffscreen();

const worker = new Worker("worker.js");
worker.postMessage(
  {
    codec: ScrcpyVideoCodecId.H264,
    canvas: offscreenCanvas,
    stream: videoPacketStream,
  },
  [offscreenCanvas, videoPacketStream],
);

worker.addEventListener("message", (e) => {
  const { width, height } = e.data as { width: number; height: number };
  // After transferControlToOffscreen, the element's intrinsic size is
  // controlled by the worker; only its CSS size can be changed here.
  canvas.style.width = `${width}px`;
  canvas.style.height = `${height}px`;
});
JavaScript:
// worker.js
import {
  WebGLVideoFrameRenderer,
  WebCodecsVideoDecoder,
} from "@yume-chan/scrcpy-decoder-webcodecs";

self.addEventListener("message", (e) => {
  const { codec, canvas, stream } = e.data;

  // Pass the transferred OffscreenCanvas to the renderer
  const renderer = new WebGLVideoFrameRenderer(canvas);
  const decoder = new WebCodecsVideoDecoder({ codec, renderer });

  decoder.sizeChanged(({ width, height }) => {
    postMessage({ width, height });
  });

  void stream.pipeTo(decoder.writable).catch((e) => {
    console.error(e);
  });
});
TypeScript:
// worker.ts
import type {
  ScrcpyMediaStreamPacket,
  ScrcpyVideoCodecId,
} from "@yume-chan/scrcpy";
import {
  WebGLVideoFrameRenderer,
  WebCodecsVideoDecoder,
} from "@yume-chan/scrcpy-decoder-webcodecs";

self.addEventListener("message", (e) => {
  const { codec, canvas, stream } = e.data as {
    codec: ScrcpyVideoCodecId;
    canvas: OffscreenCanvas;
    stream: ReadableStream<ScrcpyMediaStreamPacket>;
  };

  // Pass the transferred OffscreenCanvas to the renderer
  const renderer = new WebGLVideoFrameRenderer(canvas);
  const decoder = new WebCodecsVideoDecoder({ codec, renderer });

  decoder.sizeChanged(({ width, height }) => {
    postMessage({ width, height });
  });

  void stream.pipeTo(decoder.writable).catch((e) => {
    console.error(e);
  });
});
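In both methods, the sizeChanged event reports the video's intrinsic size. If you instead want to fit the canvas into a bounding box on screen while keeping the aspect ratio, a small helper like the one below can convert the reported size into display dimensions (fitSize is a hypothetical name, not part of the library):

```javascript
// Hypothetical helper: scale a video size to fit inside a bounding box
// while preserving aspect ratio; rounding keeps CSS pixel values integral.
function fitSize(videoWidth, videoHeight, maxWidth, maxHeight) {
  const scale = Math.min(maxWidth / videoWidth, maxHeight / videoHeight);
  return {
    width: Math.round(videoWidth * scale),
    height: Math.round(videoHeight * scale),
  };
}

// Example: fit a 1920x1080 video into a 960x960 box
// fitSize(1920, 1080, 960, 960) → { width: 960, height: 540 }
```

You would call this inside the size-message handler and apply the result to the canvas's CSS width and height.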