Ganzolo

Reputation: 1394

Stream an audio file during a WebRTC call

My application uses the Google WebRTC framework to make audio calls, and that part works. However, I would like to find a way to stream an audio file during a call.

Scenario: A calls B, B answers and plays music, and A hears this music.

I've downloaded the entire source code of WebRTC and am trying to understand how it works. On the iOS side it seems to use Audio Units; I can see a voice_processing_audio_unit file. I would (maybe wrongly) assume that I need to create a custom audio unit that reads its data from a file?

Does anyone have an idea of which direction to go in?

Upvotes: 0

Views: 998

Answers (1)

Ganzolo

Reputation: 1394

After fighting with this issue for an entire week, I finally managed to find a solution to this problem.

By editing the WebRTC code, I was able to get down to the Audio Unit level and catch the io_data buffer in the audio rendering callback.

This callback is called every 10 ms to fetch the data from the mic, so in this precise callback I was able to replace the contents of the io_data buffer with my own audio data.
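For anyone attempting the same thing, here is a minimal sketch of the idea: a Core Audio input callback that renders the mic samples and then overwrites them with PCM decoded from a file. This is not the actual WebRTC patch; `g_audio_unit`, `g_file_samples`, `g_file_length`, and `g_file_pos` are hypothetical placeholders, and the sketch assumes mono 16-bit PCM at the capture sample rate, with the file pre-decoded into memory (no file I/O on the real-time audio thread).

```c
#include <AudioToolbox/AudioToolbox.h>
#include <stdint.h>
#include <stddef.h>

static AudioUnit g_audio_unit;   // the voice-processing I/O unit (placeholder)
static int16_t  *g_file_samples; // pre-decoded PCM of the file to inject (placeholder)
static size_t    g_file_length;  // total number of samples available
static size_t    g_file_pos;     // read cursor into g_file_samples

static OSStatus RecordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData) {
  (void)inRefCon;  // unused in this sketch
  (void)ioData;    // NULL for an input callback on an I/O unit

  // The mic samples must first be rendered into a buffer we own.
  int16_t scratch[1024];  // plenty for one ~10 ms callback of mono 16-bit PCM
  if (inNumberFrames * sizeof(int16_t) > sizeof(scratch)) {
    return kAudio_ParamError;
  }
  AudioBufferList buffers;
  buffers.mNumberBuffers = 1;
  buffers.mBuffers[0].mNumberChannels = 1;
  buffers.mBuffers[0].mDataByteSize = inNumberFrames * sizeof(int16_t);
  buffers.mBuffers[0].mData = scratch;

  OSStatus status = AudioUnitRender(g_audio_unit, ioActionFlags, inTimeStamp,
                                    inBusNumber, inNumberFrames, &buffers);
  if (status != noErr) {
    return status;
  }

  // Overwrite the mic samples with samples from the file, so the remote
  // peer hears the file instead of the microphone. Pad with silence once
  // the file runs out.
  int16_t *dst = (int16_t *)buffers.mBuffers[0].mData;
  for (UInt32 i = 0; i < inNumberFrames; ++i) {
    dst[i] = (g_file_pos < g_file_length) ? g_file_samples[g_file_pos++] : 0;
  }

  // From here the (now replaced) samples would continue into the send
  // pipeline exactly as the original mic data would.
  return noErr;
}
```

In WebRTC itself, the analogous callback lives in the voice_processing_audio_unit code mentioned in the question; in a standalone Audio Unit setup, the callback would be installed on the voice-processing I/O unit's input bus (bus 1) via AudioUnitSetProperty with kAudioOutputUnitProperty_SetInputCallback. Whichever injection point you use, the injected samples must match the sample rate and channel count that the capture side is configured for.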

Upvotes: 1
