Lawallful

Reputation: 31

Is there a way in iOS to play a WebRTC "call" as a "media" sound?

I'm currently developing an app that uses WebRTC for live streaming. However, when the connection is made, WebRTC is by default treated as a call by the iOS device. This causes several unintuitive audio behaviours for my users: the volume can't be lowered to 0, the level isn't the same as for regular videos, and I can't hide the call-volume HUD when the user changes the volume.

So I'm looking for a native way to "route" (or change) the audio category from "call" audio to standard "media" audio.

Is there any way to do this?

Upvotes: 3

Views: 1461

Answers (1)

James M

Reputation: 183

You can change the category of the audio from "call" audio to something closer to standard "media" audio by routing the WebRTC stream through a Web Audio API AudioContext instead of playing it through an HTML5 audio element.

See AudioContext.createMediaStreamSource() (which creates a MediaStreamAudioSourceNode) and https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/webrtc-integration.html for examples.
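A minimal sketch of that routing, assuming an existing RTCPeerConnection (here called `pc`) that delivers a remote audio track, could look like this:

```js
// Create the peer connection as usual (signaling omitted here).
const pc = new RTCPeerConnection();
const audioCtx = new AudioContext();

pc.ontrack = (event) => {
  const remoteStream = event.streams[0];
  // Feed the WebRTC stream into the Web Audio graph instead of
  // assigning it to an <audio> element's srcObject.
  const source = audioCtx.createMediaStreamSource(remoteStream);
  source.connect(audioCtx.destination);
};
```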

Note that on iOS, the AudioContext starts in a suspended state and must be 'resumed' within the context of a user gesture event (e.g. pointer/touch event) before anything becomes audible. The AudioContext also returns to a suspended state in the following scenarios:

  1. Disconnecting headphones.
  2. Visiting another browser tab and playing audio there.

This means the WebRTC streams will go quiet in those situations, and the AudioContext has to be resumed before they become audible again.
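For illustration, here is a sketch of resuming the context from a touch gesture and reacting when it is suspended again; the `#unmute` button is a hypothetical element in your page:

```js
const audioCtx = new AudioContext(); // the same context the WebRTC stream is routed through

// Resume must happen inside a user gesture handler on iOS.
document.querySelector('#unmute').addEventListener('touchend', () => {
  if (audioCtx.state === 'suspended') {
    audioCtx.resume();
  }
});

// Detect when the context gets suspended again (headphones unplugged,
// audio played in another tab) so the UI can prompt the user to resume.
audioCtx.onstatechange = () => {
  if (audioCtx.state === 'suspended') {
    // e.g. show the unmute button again
  }
};
```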

Upvotes: 0
