Reputation: 33
I'm exploring different ways to achieve a low-latency screen share using WebRTC, and I'm running into an issue. Any help would be much appreciated!
As of now, I'm able to capture/broadcast my macOS screen to localhost using FFmpeg, then pick up and play the stream on other computers on my network.
Here's my FFmpeg command line for capturing desktop video:
ffmpeg -f avfoundation -framerate 60 -capture_cursor 1 -i "1" -c:v h264_videotoolbox -realtime 1 -vsync 2 -b:v 5000k out777777.mp4
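A variant of that command that sends the same capture to localhost as MPEG-TS over UDP instead of writing a file would look roughly like this (port 1234 is just an example):
ffmpeg -f avfoundation -framerate 60 -capture_cursor 1 -i "1" -c:v h264_videotoolbox -realtime 1 -vsync 2 -b:v 5000k -f mpegts udp://127.0.0.1:1234
That stream can then be played back with something like ffplay udp://127.0.0.1:1234.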
Is there a way to use WebRTC (ideally the data channel approach) so that a remote computer can pick up and play this UDP stream of my desktop once two peers are connected via the data channel?
Thank you!
Upvotes: 2
Views: 1336
Reputation: 4242
FFmpeg itself can't help you with WebRTC, so you will either need to stitch the pieces together yourself or use a full WebRTC implementation.
Someone did implement screen sharing over WebRTC using libx264, called webrtc-remote-screen, which could be helpful!
If you want to build your own, you will need:
SDP Implementation (and signaling to transport it)
ICE Agent
DTLS Implementation
SCTP Implementation
Then you can send your frames over SCTP. Each of these stacks is pretty complicated, so you will need to dive into each problem individually.
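If you'd rather not implement each of those layers yourself, a library such as Pion WebRTC (Go) bundles all of them. Below is a minimal, untested sketch of the sending side that forwards an MPEG-TS/UDP stream from FFmpeg over a data channel; the UDP port, STUN server, and copy/paste signaling are assumptions, and a real application would exchange the SDP over a proper signaling channel (WebSocket, HTTP, etc.):

package main

import (
	"bufio"
	"encoding/base64"
	"encoding/json"
	"fmt"
	"net"
	"os"
	"strings"

	"github.com/pion/webrtc/v3"
)

func main() {
	// The PeerConnection wraps the ICE agent, DTLS, and SCTP stacks listed above.
	pc, err := webrtc.NewPeerConnection(webrtc.Configuration{
		ICEServers: []webrtc.ICEServer{{URLs: []string{"stun:stun.l.google.com:19302"}}},
	})
	if err != nil {
		panic(err)
	}

	// A data channel rides on SCTP; the library negotiates it for you.
	dc, err := pc.CreateDataChannel("screen", nil)
	if err != nil {
		panic(err)
	}

	// Once the channel opens, listen for the MPEG-TS datagrams FFmpeg is
	// sending to udp://127.0.0.1:1234 and relay each one as a message.
	dc.OnOpen(func() {
		addr, err := net.ResolveUDPAddr("udp", "127.0.0.1:1234") // assumed port
		if err != nil {
			panic(err)
		}
		conn, err := net.ListenUDP("udp", addr)
		if err != nil {
			panic(err)
		}
		buf := make([]byte, 1500)
		for {
			n, _, err := conn.ReadFromUDP(buf)
			if err != nil {
				return
			}
			if err := dc.Send(buf[:n]); err != nil {
				return
			}
		}
	})

	// Hand-rolled signaling: print the offer, then paste in the remote answer.
	offer, err := pc.CreateOffer(nil)
	if err != nil {
		panic(err)
	}
	gatherComplete := webrtc.GatheringCompletePromise(pc)
	if err := pc.SetLocalDescription(offer); err != nil {
		panic(err)
	}
	<-gatherComplete // wait so the printed SDP already contains the ICE candidates

	encoded, _ := json.Marshal(pc.LocalDescription())
	fmt.Println(base64.StdEncoding.EncodeToString(encoded))

	answerB64, _ := bufio.NewReader(os.Stdin).ReadString('\n')
	raw, err := base64.StdEncoding.DecodeString(strings.TrimSpace(answerB64))
	if err != nil {
		panic(err)
	}
	var answer webrtc.SessionDescription
	if err := json.Unmarshal(raw, &answer); err != nil {
		panic(err)
	}
	if err := pc.SetRemoteDescription(answer); err != nil {
		panic(err)
	}

	select {} // keep forwarding until killed
}

The receiving peer mirrors this: apply the offer, return an answer, and in dc.OnMessage write each payload to a local UDP port that a player like ffplay is listening on.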
Upvotes: 2