Reputation: 810
How can one use decoded AudioBuffer data as the source for an HTMLAudioElement?
Let us assume we have an HTMLAudioElement:
let audio = new Audio();
And assume we were also able to fetch and decode the audio data:
let context = new AudioContext();
let source = context.createBufferSource(); // this represents the audio source; we still need to populate it with decoded data

Api.getAudioFile(url).then((data) => {
  context.decodeAudioData(data, (buffer) => {
    source.buffer = buffer;
  }, (err) => console.error('decodeAudioData failed:', err));
});
How can one use source as the source for an audio element? I assume I have to create a MediaStream object for that, but it's not quite clear how to do it.
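For context, playing the decoded buffer directly through the Web Audio graph is straightforward; it's only the HTMLAudioElement route that I'm unsure about. A browser-only sketch of the direct route, shown just for contrast:

```javascript
// Browser-only sketch: play a decoded AudioBuffer without any <audio> element.
function playBuffer(context, audioBuffer) {
  const source = context.createBufferSource(); // one-shot source node
  source.buffer = audioBuffer;                 // attach the decoded data
  source.connect(context.destination);         // route it to the speakers
  source.start();                              // playback begins immediately
  return source;                               // keep a handle to call stop() on
}
```

But this gives me no HTMLAudioElement (and none of its controls), which is what I'm after.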
Upvotes: 3
Views: 3417
Reputation: 137113
The easiest solution is obviously to set the src of your AudioElement to the url you are fetching the data from.
In your case it seems it's not that easy (because you need some credentials to be passed along with the request). In that case, if you can, make your response a Blob instead of an ArrayBuffer, and then simply set your AudioElement's src to a blob URI created from this Blob.
const audio = document.querySelector('audio');
const url = "https://dl.dropboxusercontent.com/s/8c9m92u1euqnkaz/GershwinWhiteman-RhapsodyInBluePart1.mp3";

fetch(url)
  .then(r => r.blob()) // consume the response as a Blob
  .then(blob => audio.src = URL.createObjectURL(blob)); // point the element at the Blob
<audio controls></audio>
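One caveat worth noting (an aside, not part of the original snippet): a blob: URL keeps its Blob alive in memory until the URL is revoked, so long-lived pages should release it once the element has loaded the data. A minimal sketch with a dummy Blob standing in for the fetched audio:

```javascript
// Sketch: blob: URLs pin their Blob in memory until revoked.
const blob = new Blob(["dummy bytes"]); // stand-in for the fetched audio Blob
const blobUrl = URL.createObjectURL(blob);
console.log(blobUrl.startsWith("blob:")); // true — usable as audio.src
// Once the element has buffered the data (e.g. on 'canplaythrough'),
// the URL can be released:
URL.revokeObjectURL(blobUrl);
```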
And if you can't modify your request, or just need that ArrayBuffer, e.g. for the Web Audio context, then simply create a Blob from it:
const audio = document.querySelector('audio');
const url = "https://dl.dropboxusercontent.com/s/8c9m92u1euqnkaz/GershwinWhiteman-RhapsodyInBluePart1.mp3";

fetch(url)
  .then(r => r.arrayBuffer()) // consume the response as an ArrayBuffer
  .then(buffer => {
    // the same ArrayBuffer can be wrapped in a Blob for the element's src...
    audio.src = URL.createObjectURL(new Blob([buffer]));
    const ctx = new AudioContext();
    return ctx.decodeAudioData(buffer); // ...and still be decoded for the Web Audio API
  })
  .then(console.log);
<audio controls></audio>
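If credentials really do have to accompany the request (as the question implies), they can be attached to the fetch itself. A hedged sketch: the header name, URL, and token below are assumptions for illustration, so adapt them to your API:

```javascript
// Hypothetical sketch: build an authenticated request for the audio file.
function makeAudioRequest(url, token) {
  return new Request(url, {
    headers: { Authorization: `Bearer ${token}` }, // assumed auth scheme
  });
}

const req = makeAudioRequest("https://example.com/audio.mp3", "my-token");
console.log(req.headers.get("Authorization")); // "Bearer my-token"
// fetch(req).then(r => r.blob()) — then proceed exactly as above
```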
Upvotes: 7