Reputation: 6467
I have an audio visualizer written in JS which draws onto a <canvas> element.
Is it possible (without screen capture) to turn that <canvas> into a (realtime) video stream? Perhaps by somehow writing it to a socket directly.
The JS uses THREE.js for rendering.
Preferably I'd like to be able to run this on a webserver. It's probably not possible to do this without actually using a browser, but if it is, I'd be very happy to hear about it ;)
Upvotes: 1
Views: 2557
Reputation: 6467
Using the info from Blindman67, I've managed to figure out a way of achieving the desired result.
I will end up using PhantomJS and have it write images to /dev/stdout (or another socket), then use ffmpeg to turn that into a video stream (sort of as described in this question).
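Roughly what I have in mind (untested sketch; the page URL, viewport size, frame rate and ffmpeg options are just placeholders I still need to tune):

```js
// render.js — run with PhantomJS and pipe the PNG frames into ffmpeg, e.g.:
//   phantomjs render.js | ffmpeg -f image2pipe -framerate 30 -i - -c:v libx264 -pix_fmt yuv420p -f flv rtmp://...
var page = require('webpage').create();
page.viewportSize = { width: 1280, height: 720 };

page.open('http://localhost:8080/visualizer.html', function (status) {
  if (status !== 'success') { phantom.exit(1); }
  setInterval(function () {
    // each call writes one PNG frame to stdout, which ffmpeg reads via image2pipe
    page.render('/dev/stdout', { format: 'png' });
  }, 1000 / 30);
});
```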
I will also run a test using Whammy, but as described on its GitHub page that might not produce the desired result; only one way to find out.
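Going by the Whammy readme, the test would look something like this (untested; as far as I know it relies on canvas.toDataURL('image/webp'), which only Chrome supports, hence my doubt):

```js
// quick Whammy test — feed canvas frames to the encoder, then compile to WebM
var encoder = new Whammy.Video(30); // target fps

function captureFrame() {
  // renderer.domElement is the canvas THREE.js renders into
  encoder.add(renderer.domElement.toDataURL('image/webp'));
}

function finish() {
  // compile the collected frames into a WebM blob,
  // which could then be uploaded or streamed in chunks
  var webmBlob = encoder.compile();
}
```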
Edit: I will also try the suggestion from kaiido to use WebRTC.
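If that pans out, the browser side would be along these lines (a sketch; `socket` is a hypothetical WebSocket to my server, and the WebRTC signaling is left out):

```js
// grab a MediaStream straight from the THREE.js canvas
const canvas = renderer.domElement;      // the canvas THREE.js renders into
const stream = canvas.captureStream(30); // 30 fps video track

// option 1: chunked recording, shipped to the server over a WebSocket
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
recorder.ondataavailable = function (e) {
  socket.send(e.data); // e.data is a WebM Blob chunk
};
recorder.start(1000); // emit a chunk every second

// option 2: true realtime via WebRTC (signaling omitted)
// const pc = new RTCPeerConnection();
// stream.getVideoTracks().forEach(function (track) { pc.addTrack(track, stream); });
```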
Upvotes: 3