Monalisa

Reputation: 131

GStreamer webrtcbin working sample pipeline

Can someone show an up-to-date webrtcbin pipeline? At the moment I use these pipelines and they do not work.

Send:

gst-launch-1.0 webrtcbin bundle-policy=max-bundle name=sendrecv  stun-server=stun://stun.l.google.com:19302 audiotestsrc is-live=true wave=red-noise ! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! application/x-rtp,media=audio,encoding-name=OPUS,payload=97 ! sendrecv.

Receive:

gst-launch-1.0 webrtcbin bundle-policy=max-bundle name=sendrecv  stun-server=stun://stun.l.google.com:19302 ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink async=false

Thanks!!!

Upvotes: 12

Views: 12049

Answers (2)

bain

Reputation: 2102

This is possible with the webrtcsink element, which wraps webrtcbin and includes a built-in signalling server and web UI:

git clone https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs.git
cd gst-plugins-rs
# This requires a Rust compiler
cargo build --release -p gst-plugin-webrtc --lib
export GST_PLUGIN_PATH=`pwd`/target
cd net/webrtc
# This requires npm
npm --prefix gstwebrtc-api install
npm --prefix gstwebrtc-api run build
gst-launch-1.0 videotestsrc ! webrtcsink run-signalling-server=true run-web-server=true

The web server defaults to port 8080, so visit http://localhost:8080 and you should see a web page. Click on the video stream ID and it will open the WebRTC stream containing the output of the GStreamer videotestsrc element.

(For debugging and testing, you can run the web server separately: pass run-web-server=false to webrtcsink in the gst-launch-1.0 command, then run cd net/webrtc/gstwebrtc-api && npm start in another terminal. Note that when the web server is run this way it binds to http://localhost:9090 by default - see webpack.config.cjs - not port 8080, which is used in the Rust code.)

For further information on webrtcbin, webrtcsink and webrtcsrc, see WebRTC Plumbing with GStreamer.

Upvotes: 1

PilotInPyjamas

Reputation: 987

The short answer is that you will not be able to run a working WebRTC pipeline from the command line alone. Another user has already posted example code that you will need to adapt for your use case: https://github.com/centricular/gstwebrtc-demos

The command-line example is missing a critical piece of WebRTC: the signalling server. Before a WebRTC connection can be established, the sending and receiving ends of the pipeline need to exchange two pieces of information: the SDP and the ICE candidates. These let them negotiate the format and parameters of the stream (SDP) and a way to reach each other over a peer-to-peer connection (ICE). Without this exchange, a connection cannot be established.
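
For a concrete picture, here is a minimal Python sketch of the sending side, modelled on the webrtc_sendrecv.py example from gstwebrtc-demos, showing how webrtcbin's signals hook into a signalling channel. The send_to_peer function and the two handle_remote_* helpers are placeholders for whatever transport your signalling server actually uses (WebSocket, etc.):

#!/usr/bin/env python3
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstWebRTC', '1.0')
gi.require_version('GstSdp', '1.0')
from gi.repository import Gst, GstWebRTC, GstSdp, GLib

Gst.init(None)

PIPELINE_DESC = (
    'webrtcbin name=sendrecv bundle-policy=max-bundle '
    'stun-server=stun://stun.l.google.com:19302 '
    'audiotestsrc is-live=true wave=red-noise ! audioconvert ! audioresample ! queue '
    '! opusenc ! rtpopuspay ! '
    'application/x-rtp,media=audio,encoding-name=OPUS,payload=97 ! sendrecv.'
)

def send_to_peer(message):
    # Placeholder: deliver this dict (e.g. as JSON) to the remote peer
    # over your signalling channel.
    print(message)

def on_offer_created(promise, webrtc, _):
    # Called when webrtcbin has produced an SDP offer.
    promise.wait()
    offer = promise.get_reply().get_value('offer')
    webrtc.emit('set-local-description', offer, Gst.Promise.new())
    send_to_peer({'sdp': {'type': 'offer', 'sdp': offer.sdp.as_text()}})

def on_negotiation_needed(webrtc):
    promise = Gst.Promise.new_with_change_func(on_offer_created, webrtc, None)
    webrtc.emit('create-offer', None, promise)

def on_ice_candidate(webrtc, mlineindex, candidate):
    # Each local ICE candidate must also be forwarded to the remote peer.
    send_to_peer({'ice': {'candidate': candidate, 'sdpMLineIndex': mlineindex}})

# Call these when the corresponding messages arrive from the remote peer.
def handle_remote_answer(webrtc, sdp_text):
    _, sdpmsg = GstSdp.SDPMessage.new()
    GstSdp.sdp_message_parse_buffer(sdp_text.encode(), sdpmsg)
    answer = GstWebRTC.WebRTCSessionDescription.new(
        GstWebRTC.WebRTCSDPType.ANSWER, sdpmsg)
    webrtc.emit('set-remote-description', answer, Gst.Promise.new())

def handle_remote_ice(webrtc, mlineindex, candidate):
    webrtc.emit('add-ice-candidate', mlineindex, candidate)

pipe = Gst.parse_launch(PIPELINE_DESC)
webrtc = pipe.get_by_name('sendrecv')
webrtc.connect('on-negotiation-needed', on_negotiation_needed)
webrtc.connect('on-ice-candidate', on_ice_candidate)
pipe.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()

The receiving side is wired up the same way, except it waits for the offer, emits create-answer instead of create-offer, and handles webrtcbin's pad-added signal to depayload and decode the incoming stream.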

Note that the WebRTC specification does not say how the signalling server has to be implemented. It is perfectly valid to exchange the SDP and ICE candidates by email, for example; in practice, though, the signalling server is usually an actual server.
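
Purely as an illustration (not a production design), the sketch below is about the smallest possible signalling relay: a Python asyncio TCP server that forwards newline-delimited SDP/ICE messages verbatim between the connected peers. The host, port and wire format here are arbitrary choices; a real setup would normally use WebSockets and per-peer IDs, as the gstwebrtc-demos signalling server does:

import asyncio

writers = []  # the currently connected peers

async def handle(reader, writer):
    writers.append(writer)
    try:
        while True:
            line = await reader.readline()  # one SDP/ICE JSON blob per line
            if not line:
                break
            # Relay the message verbatim to every other connected peer.
            for other in writers:
                if other is not writer:
                    other.write(line)
                    await other.drain()
    finally:
        writers.remove(writer)
        writer.close()

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 8443)
    async with server:
        await server.serve_forever()

asyncio.run(main())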

Upvotes: 4
