Reputation: 6294
I get a MediaStream containing audio data from WebRTC. Let's call this stream srcStream.
If I have this HTML:
<audio id="audio" controls autoplay></audio>
And I run
audioEl = document.querySelector("#audio");
audioEl.srcObject = srcStream;
I can hear the audio, and I can see the audio element starting to count the number of seconds.
However, I get multiple audio streams, so I would like to do something more general and join all these streams into a single stream. If I run
audioCtx = new AudioContext();
dst = audioCtx.createMediaStreamDestination();
audioEl.srcObject = dst.stream;
src = audioCtx.createMediaStreamSource(srcStream);
src.connect(dst);
The audio element shows as playing, but I can't hear any sound.
Is there a problem with how I create my destination?
Upvotes: 2
Views: 997
Reputation: 6294
This is a known bug in Chrome: WebAudio receives silence from a remote WebRTC stream unless that stream is also attached to a media element. The workaround is to additionally attach the stream to a new Audio object, even if it's never used:
new Audio().srcObject = srcStream;
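Putting the workaround together with the merging approach from the question, a minimal sketch might look like this (srcStreams, an array of the incoming WebRTC streams, is an assumption for illustration):

const audioEl = document.querySelector("#audio");
const audioCtx = new AudioContext();
const dst = audioCtx.createMediaStreamDestination();

for (const stream of srcStreams) {
  // Chrome workaround: attach each stream to a throwaway media element,
  // otherwise the source node below produces silence.
  new Audio().srcObject = stream;
  // Mix every stream into the single destination.
  audioCtx.createMediaStreamSource(stream).connect(dst);
}

audioEl.srcObject = dst.stream;

Note that browsers may create the AudioContext in a suspended state when there has been no user gesture; if so, calling audioCtx.resume() from a click handler gets it running.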
Upvotes: 5
Reputation: 6048
Perhaps check the console for messages about playback being blocked by CORS restrictions. For WebAudio to play the audio, CORS has to be set up correctly on the audio source so that the page is allowed to access the data.
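As an illustration, for a file-backed source (this applies when the element points at a remote URL rather than a local MediaStream; the URL here is hypothetical), you would set crossorigin on the element:

<audio id="audio" controls autoplay crossorigin="anonymous" src="https://example.com/audio.ogg"></audio>

and the server hosting the file has to respond with an Access-Control-Allow-Origin header that permits your origin.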
Upvotes: 0