Aftab Baig

Reputation: 291

WebRTC iOS Audio Chat

I am creating a voice-only (no video) chat application. I have created my own node.js/socket.io based server for signaling.

For WebRTC, I am using the following pod: https://cocoapods.org/pods/WebRTC

I have successfully created the peer connection, added the local stream, set the local/remote SDP, and sent/received ICE candidates. The "didAddStream" delegate method is also called with audio tracks, but I am stuck there: I don't know what to do with the audio track. What should the next step be? How do I send/receive audio on both sides?
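For context, the audio-only setup described above can be sketched roughly as follows, assuming the Google WebRTC pod's Swift API; the stream/track identifiers and the STUN server URL are illustrative, and the signaling and delegate plumbing are omitted:

```swift
import WebRTC

final class AudioCallClient: NSObject {
    private let factory = RTCPeerConnectionFactory()
    private var peerConnection: RTCPeerConnection?

    func start(delegate: RTCPeerConnectionDelegate) {
        let config = RTCConfiguration()
        config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]
        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                              optionalConstraints: nil)
        peerConnection = factory.peerConnection(with: config,
                                                constraints: constraints,
                                                delegate: delegate)

        // Audio-only: create an audio track and add it to the local
        // stream; no video track is ever created.
        let audioSource = factory.audioSource(with: constraints)
        let audioTrack = factory.audioTrack(with: audioSource, trackId: "audio0")
        let localStream = factory.mediaStream(withStreamId: "stream0")
        localStream.addAudioTrack(audioTrack)
        peerConnection?.add(localStream)
        // Offer/answer and ICE candidate exchange then happen over
        // the socket.io signaling channel, as described above.
    }
}
```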

Also, if I integrate CallKit, what changes would I need to make?

Upvotes: 3

Views: 5651

Answers (1)

Kyle Redfearn

Reputation: 2281

I got stuck on this one too. You have to retain the RTCMediaStream object in order for the audio to play. You don't need to do anything with the RTCAudioTrack; it plays automatically. I simply assign the stream to a property so it stays retained. See my example here: https://github.com/redfearnk/WebRTCVideoChat/blob/master/WebRTCVideoChat/WebRTCClient.swift#L143
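The retain pattern looks roughly like this, assuming the Google WebRTC pod's Swift delegate signature and trimmed to the one relevant method (a real conforming class must implement the rest of RTCPeerConnectionDelegate):

```swift
import WebRTC

final class WebRTCClient: NSObject {
    // Holding a strong reference here keeps the stream's RTCAudioTrack
    // alive; without it, the stream is deallocated and no audio plays.
    private var remoteStream: RTCMediaStream?
}

extension WebRTCClient: RTCPeerConnectionDelegate {
    func peerConnection(_ peerConnection: RTCPeerConnection,
                        didAdd stream: RTCMediaStream) {
        // Retaining the stream is all that's needed; the audio track
        // starts playing automatically once the connection is up.
        remoteStream = stream
    }
    // ... remaining RTCPeerConnectionDelegate methods ...
}
```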

Upvotes: 3
