yi chen

Reputation: 241

Streamed video with WebSockets and rendered with WebGL

I have an idea: rendering video data streamed over WebSockets with WebGL. I know we can import video data (a frame) from the <video> tag into WebGL as a texture. However, I'm wondering if I can use sockets instead, so I have more control over it. Is this idea possible?

Thanks, Yi

Upvotes: 1

Views: 1972

Answers (1)

eepp

Reputation: 7575

This guy implements simple webcam sharing using WebSocket. He sends one frame at a time as a Base64-encoded JPEG image over a WebSocket and assigns it to the src attribute of an <img> on the browser side as soon as it's received.

According to the page, he's achieving 640×480 @ 30 fps on Chrome and 320×240 @ 30 fps on iPhone/iPad.
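The receive side of that approach can be sketched roughly as follows. This is an assumption-laden sketch, not code from the linked page: the helper name jpegDataUrl, the function startStream, the WebSocket URL, and the element id are all placeholders.

```javascript
// Build a data URL the browser can decode as a JPEG image from one
// Base64-encoded frame received over the socket.
function jpegDataUrl(base64Frame) {
  return "data:image/jpeg;base64," + base64Frame;
}

// Wire a WebSocket to an <img> element so each incoming message becomes
// the next displayed frame. The url and imgId values are placeholders.
function startStream(url, imgId) {
  var img = document.getElementById(imgId);
  var ws = new WebSocket(url);
  ws.onmessage = function (event) {
    // event.data: one Base64-encoded JPEG frame.
    img.src = jpegDataUrl(event.data);
  };
  return ws;
}
```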

I don't see why you couldn't use the image data afterwards to create a WebGL texture:

var textureImg = document.getElementById("myTextureImg");
var texture = gl.createTexture();
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
// Full WebGL 1 signature: target, level, internal format, format, type, source.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, textureImg);
// Note: generateMipmap requires power-of-two dimensions in WebGL 1.
gl.generateMipmap(gl.TEXTURE_2D);

and so on.
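Since frames arrive continuously, the texture has to be re-uploaded whenever the <img> finishes decoding a new one. A minimal sketch of that step, assuming an uploadFrame helper (the name is mine, not from the original answer) called from the image's onload handler:

```javascript
// Re-upload the WebGL texture from the <img> each time a new frame loads.
// Non-power-of-two frame sizes (e.g. 640x480) cannot be mipmapped in
// WebGL 1, so this uses linear filtering and clamped wrapping instead
// of gl.generateMipmap.
function uploadFrame(gl, texture, img) {
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
}

// Usage sketch: textureImg.onload = function () { uploadFrame(gl, texture, textureImg); };
```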

Upvotes: 3
