Reputation: 83
I am currently working on a project that consists of a chart showing audio levels picked up by another device. The charts are made with the flot API, and I have zooming and selecting capabilities so that a time range can be selected on the chart and zoomed into. My next step is to allow the user to listen to the audio that corresponds to that region of the chart. The audio files are stored on a shared server as individual, minute-by-minute RAW data files. I have no experience with using audio in a webpage and am currently struggling with how to complete this task. As far as I have found, the <audio> HTML tag is incapable of processing RAW data files for playback. I have been looking into the Web Audio API but am confused about how it works and how to implement it.
My first question is: how do I go about decoding RAW audio files from a server and presenting them on an HTML page for a client to listen to?
My second task is to grab all of the audio files corresponding to the selected range and combine them into one audio output. For example, if the client selected the time range 1:00pm - 1:50pm, I would need to access 50 RAW audio files, each a minute in length, and combine them to produce a single playback. So my second question is whether anyone knows a way to accomplish this smoothly.
Thank you for any help anyone has to offer!
Upvotes: 4
Views: 13586
Reputation: 6048
An alternative that might be a bit easier with Web Audio: you can basically do the same as above, but without an Audio element. If necessary, convert the raw audio data to a float array, say f, and do something like this:
// Only need to do this once when setting up the page
let c = new AudioContext();
// Do this for each clip:
let b = new AudioBuffer({length: f.length, sampleRate: c.sampleRate});
b.copyToChannel(f, 0);
let s = new AudioBufferSourceNode(c, {buffer: b});
s.connect(c.destination);
s.start();
This is a rough sketch of how to use Web Audio for the playback. It can be refined to reuse AudioBuffers, and you have to take care of calling s.start() with the right time values so that consecutive clips line up (see the sketch below). But I hope this is enough to get you started. If not, please ask more questions.
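For example, here is a rough sketch of fetching a few clips and scheduling them back to back. It assumes the RAW data is 32-bit float mono at the context's sample rate; the playClips helper and the file names are hypothetical placeholders for your own minute-by-minute files.
// Sketch only: schedule several clips seamlessly against the context clock.
// Note: browsers may require a user gesture before the context will start.
const ctx = new AudioContext();

async function playClips(urls) {
  let when = ctx.currentTime + 0.1;  // small lead time before the first clip
  for (const url of urls) {
    // assumes each fetch completes before that clip's scheduled start time
    const raw = await (await fetch(url)).arrayBuffer();
    const f = new Float32Array(raw);  // reinterpret the bytes as float samples
    const b = new AudioBuffer({ length: f.length, sampleRate: ctx.sampleRate });
    b.copyToChannel(f, 0);
    const s = new AudioBufferSourceNode(ctx, { buffer: b });
    s.connect(ctx.destination);
    s.start(when);       // queue each clip exactly where the previous one ends
    when += b.duration;  // AudioBuffer.duration is in seconds
  }
}

playClips(['audio-1300.raw', 'audio-1301.raw']);  // hypothetical file names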
Upvotes: 1
Reputation: 9072
RAW files are already decoded PCM audio, but Audio elements can't play PCM directly. You'll need to prepend a RIFF/WAV header to the PCM bytes first. Multiple RAW files can be combined by concatenating their PCM bytes and setting the total sample/frame length in the header (see the sketch after the snippet below). 50 minutes of decoded audio will take up a lot of memory in the browser, so keep an eye on that and measure/optimize accordingly.
initAudio()

async function initAudio() {
  // specify your file and its audio properties
  const url = 'https://dev.anthum.com/audio-worklet/audio/decoded-left.raw'
  const sampleRate = 48000
  const numChannels = 1 // mono or stereo
  const isFloat = true  // integer or floating point

  const buffer = await (await fetch(url)).arrayBuffer()

  // create WAV header
  const [type, format] = isFloat ? [Float32Array, 3] : [Uint8Array, 1]
  const wavHeader = new Uint8Array(buildWaveHeader({
    numFrames: buffer.byteLength / (type.BYTES_PER_ELEMENT * numChannels), // frames = samples / channels
    bytesPerSample: type.BYTES_PER_ELEMENT,
    sampleRate,
    numChannels,
    format
  }))

  // create WAV file with header and downloaded PCM audio
  const wavBytes = new Uint8Array(wavHeader.length + buffer.byteLength)
  wavBytes.set(wavHeader, 0)
  wavBytes.set(new Uint8Array(buffer), wavHeader.length)

  // show audio player
  const audio = document.querySelector('audio')
  const blob = new Blob([wavBytes], { type: 'audio/wav' })
  audio.src = URL.createObjectURL(blob)
  document.querySelector('#loading').hidden = true
  audio.hidden = false
}
// adapted from https://gist.github.com/also/900023
function buildWaveHeader(opts) {
  const numFrames = opts.numFrames;
  const numChannels = opts.numChannels || 2;
  const sampleRate = opts.sampleRate || 44100;
  const bytesPerSample = opts.bytesPerSample || 2;
  const format = opts.format;
  const blockAlign = numChannels * bytesPerSample;
  const byteRate = sampleRate * blockAlign;
  const dataSize = numFrames * blockAlign;
  const buffer = new ArrayBuffer(44);
  const dv = new DataView(buffer);
  let p = 0;

  function writeString(s) {
    for (let i = 0; i < s.length; i++) {
      dv.setUint8(p + i, s.charCodeAt(i));
    }
    p += s.length;
  }

  function writeUint32(d) {
    dv.setUint32(p, d, true);
    p += 4;
  }

  function writeUint16(d) {
    dv.setUint16(p, d, true);
    p += 2;
  }

  writeString('RIFF');              // ChunkID
  writeUint32(dataSize + 36);       // ChunkSize
  writeString('WAVE');              // Format
  writeString('fmt ');              // Subchunk1ID
  writeUint32(16);                  // Subchunk1Size
  writeUint16(format);              // AudioFormat
  writeUint16(numChannels);         // NumChannels
  writeUint32(sampleRate);          // SampleRate
  writeUint32(byteRate);            // ByteRate
  writeUint16(blockAlign);          // BlockAlign
  writeUint16(bytesPerSample * 8);  // BitsPerSample
  writeString('data');              // Subchunk2ID
  writeUint32(dataSize);            // Subchunk2Size

  return buffer;
}
body {
  text-align: center;
  padding-top: 1rem;
}

[hidden] {
  display: none;
}

audio {
  display: inline-block;
}
<div id="loading">Loading...</div>
<audio hidden controls></audio>
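To address the second part of the question, here is a rough sketch of combining several one-minute RAW clips into a single WAV, reusing buildWaveHeader from above. It assumes all clips share the same format (32-bit float mono at 48 kHz here); the combineRawToWav name and the URL list are hypothetical.
// Sketch only: fetch equal-format RAW clips, concatenate their PCM bytes,
// and write one header sized for the combined data.
async function combineRawToWav(urls, sampleRate = 48000) {
  const buffers = await Promise.all(
    urls.map(async url => (await fetch(url)).arrayBuffer())
  )
  const pcmLength = buffers.reduce((total, b) => total + b.byteLength, 0)
  const header = new Uint8Array(buildWaveHeader({
    numFrames: pcmLength / Float32Array.BYTES_PER_ELEMENT, // mono: samples == frames
    bytesPerSample: Float32Array.BYTES_PER_ELEMENT,
    sampleRate,
    numChannels: 1,
    format: 3 // 3 = IEEE float, 1 = integer PCM
  }))
  const wavBytes = new Uint8Array(header.length + pcmLength)
  wavBytes.set(header, 0)
  let offset = header.length
  for (const b of buffers) {
    wavBytes.set(new Uint8Array(b), offset)
    offset += b.byteLength
  }
  return new Blob([wavBytes], { type: 'audio/wav' })
}
You could then point the player at the result, e.g. audio.src = URL.createObjectURL(await combineRawToWav(urls)).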
Upvotes: 6