Reputation: 123
I am working on a small app that takes user audio and sends it to a back-end service. For this, I am using Angular.
So far I have created an app that records user audio and sends it; I can also download the file and hear myself (for testing purposes).
Now, when the user audio is too long, let's say more than 10 seconds, I want to slice it up into chunks. I have implemented the code for this: I can get the chunks, push them into an array, and download them all. But when I try to play them, only the first audio chunk plays; the rest I cannot even open.
This is where I slice my blob into sub-blobs.
if (this.duration > this.audioLimitBeforeChunking) {
  let numChunks = this.duration / this.audioLimitBeforeChunking;
  numChunks = Math.floor(numChunks / 1000); // assign the result; it was previously discarded
  console.log('numChunks is: ', numChunks);
  for (let index = 0; index < numChunks; index++) { // '<', not '<=', to avoid an extra out-of-range chunk
    const byteEnd = Math.ceil((size / numChunks) * (index + 1));
    this.blobArray.push(blob.slice(this.byteIndex, byteEnd));
    this.byteIndex = byteEnd; // equivalent to the old '+=' expression
  }
}
This is where I download all the sub-blobs.
for (let i = 0; i < this.blobArray.length; i++) { // '<', not '<=', to stay within the array bounds
  this.url = URL.createObjectURL(this.blobArray[i]);
  this.audioFile = new File([this.blobArray[i]], "I_like_apples" + i + ".wav");
  let postBody = {
    'file': this.audioFile,
    'user_token': '*******',
    'reference_text': '******',
    'model_id': '******'
  }
  console.log("inside sendData() component");
  this.httpService.getStatus(postBody).subscribe(
    (response: any) => {
      console.log(response)
    }
  )
  const a = document.createElement('a');
  a.style.display = 'none';
  a.href = this.url;
  a.download = 'test' + i + '.wav'; // give each chunk a unique filename
  document.body.appendChild(a);
  a.click();
}
As I mentioned above, I can only play the first sub-blob; the rest I cannot even open. Any idea what is going on here? Am I missing something?
Upvotes: 2
Views: 663
Reputation: 18805
You can't just split an audio file into byte chunks like this and then use them for playback, unless each blob includes the WAV
(RIFF) audio format metadata. This metadata is what allows the browser to interpret the file as audio and play it back. That is why your first blob works but the rest don't: only the first one includes the original metadata from the file.
Doing this from scratch is not exactly trivial.
The header of a WAV (RIFF) file is 44 bytes long and has the following format:
| Positions | Sample Value | Description |
| --- | --- | --- |
| 1-4 | "RIFF" | Marks the file as a RIFF file. Characters are each 1 byte long. |
| 5-8 | File size (integer) | Size of the overall file minus 8 bytes, in bytes (32-bit integer). Typically, you'd fill this in after creation. |
| 9-12 | "WAVE" | File type header. For our purposes, it always equals "WAVE". |
| 13-16 | "fmt " | Format chunk marker. Includes trailing null. |
| 17-20 | 16 | Length of format data as listed above. |
| 21-22 | 1 | Type of format (1 is PCM) - 2-byte integer. |
| 23-24 | 2 | Number of channels - 2-byte integer. |
| 25-28 | 44100 | Sample rate - 32-bit integer. Common values are 44100 (CD) and 48000 (DAT). Sample rate = number of samples per second, or Hertz. |
| 29-32 | 176400 | (Sample Rate * BitsPerSample * Channels) / 8. |
| 33-34 | 4 | (BitsPerSample * Channels) / 8. 1 = 8-bit mono; 2 = 8-bit stereo / 16-bit mono; 4 = 16-bit stereo. |
| 35-36 | 16 | Bits per sample. |
| 37-40 | "data" | "data" chunk header. Marks the beginning of the data section. |
| 41-44 | File size (data) | Size of the data section. |
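To make the layout concrete, here is a sketch of how such a header could be assembled with a `DataView`. The function name and parameters are mine, not from your code, and it assumes uncompressed PCM data whose size you already know:

```typescript
// Illustrative sketch: build a 44-byte PCM WAV header matching the table above.
function buildWavHeader(
  dataSize: number,      // size of the raw sample data in bytes
  channels: number,      // e.g. 1 for mono, 2 for stereo
  sampleRate: number,    // e.g. 44100
  bitsPerSample: number  // e.g. 16
): Uint8Array {
  const buffer = new ArrayBuffer(44);
  const view = new DataView(buffer);
  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeString(0, 'RIFF');                                                // bytes 1-4: RIFF marker
  view.setUint32(4, 36 + dataSize, true);                                // bytes 5-8: overall size - 8 (little-endian)
  writeString(8, 'WAVE');                                                // bytes 9-12: file type header
  writeString(12, 'fmt ');                                               // bytes 13-16: format chunk marker
  view.setUint32(16, 16, true);                                          // bytes 17-20: length of format data
  view.setUint16(20, 1, true);                                           // bytes 21-22: format type (1 = PCM)
  view.setUint16(22, channels, true);                                    // bytes 23-24: number of channels
  view.setUint32(24, sampleRate, true);                                  // bytes 25-28: sample rate
  view.setUint32(28, (sampleRate * bitsPerSample * channels) / 8, true); // bytes 29-32: byte rate
  view.setUint16(32, (bitsPerSample * channels) / 8, true);              // bytes 33-34: block align
  view.setUint16(34, bitsPerSample, true);                               // bytes 35-36: bits per sample
  writeString(36, 'data');                                               // bytes 37-40: data chunk header
  view.setUint32(40, dataSize, true);                                    // bytes 41-44: size of the data section
  return new Uint8Array(buffer);
}
```

You would then build each downloadable chunk as `new Blob([buildWavHeader(chunkBytes.length, 1, 44100, 16), chunkBytes], { type: 'audio/wav' })` - but note this only holds if the sliced bytes really are raw PCM aligned on sample boundaries.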
I would recommend using a library like https://github.com/rochars/wavefile to assist you in generating these blobs.
Alternatively, you could possibly take the first 44 bytes from the original file and prepend them to the start of each of your blobs - I don't know whether this would actually work, though.
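That prepending idea can be sketched in a couple of lines with `Blob.slice`. The function name is mine, and the same caveat applies: the copied header's size fields would still describe the original file, so players may or may not accept the result:

```typescript
// Hedged sketch: copy the original recording's 44-byte RIFF header
// in front of a raw chunk so the browser can recognise it as WAV.
function prependHeader(original: Blob, chunk: Blob): Blob {
  const header = original.slice(0, 44); // the original file's RIFF header
  return new Blob([header, chunk], { type: 'audio/wav' });
}
```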
Upvotes: 2