Reputation: 1678
I would like to implement an AudioWorkletProcessor that is aware of time. For example: how to reimplement the DelayNode as a Processor?
The MDN docs say:
By specification, each block of audio your process() function receives contains 128 frames (that is, 128 samples for each channel), but it is planned that this value will change in the future, and may in fact vary depending on circumstances, so you should always check the array's length rather than assuming a particular size.
I can get the number of frames from the input's length, but how do I get the sample rate, so that I can tell how long (in seconds) this input is?
My end goal is to be able to compute the average energy of a signal over a certain time window.
class EnergyProcessor extends AudioWorkletProcessor {
  process(inputs, outputs, parameters) {
    if (inputs.length !== 1) {
      throw new Error('invalid inputs')
    }
    // how much time is covered by inputs?
    inputs[0].forEach((channel, channelID) => {
      // running mean of the squared samples within this block
      let sum = 0
      let count = 0
      channel.forEach((value, i) => {
        sum += value * value
        count += 1
        for (let o = 0; o < outputs.length; o++) {
          // skip when writing x channels to x - 1
          if (channelID >= outputs[o].length) {
            continue
          }
          outputs[o][channelID][i] = sum / count
        }
      })
    })
    return true
  }
}
registerProcessor('EnergyProcessor', EnergyProcessor)
Upvotes: 3
Views: 1634
Reputation: 168966
MDN says that an AudioWorkletProcessor
[...] lives in the AudioWorkletGlobalScope and runs on the Web Audio rendering thread.
and AudioWorkletGlobalScope is documented to expose its context's sample rate:
sampleRate (Read only)
Returns a float that represents the sample rate of the associated BaseAudioContext.
so you probably can simply, magically
console.log(sampleRate)
or whatever you need to do.
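For the energy-over-a-window goal, here is a minimal sketch of the frames-to-seconds conversion that `sampleRate` enables. Note that `sampleRate` is hard-coded here only so the snippet stands alone; inside an AudioWorkletProcessor it is already defined as a global:

```javascript
// sampleRate is a global in AudioWorkletGlobalScope; hard-coded here
// (48 kHz is just an example value) so the snippet is self-contained.
const sampleRate = 48000

// Duration (in seconds) covered by one render block of frameCount frames.
function blockDuration(frameCount) {
  return frameCount / sampleRate
}

// Number of frames needed to cover a desired averaging window.
function framesForWindow(seconds) {
  return Math.round(seconds * sampleRate)
}
```

Inside `process()` you would call `blockDuration(inputs[0][0].length)` to see how much time the current block covers, and `framesForWindow(0.5)` to size, say, a half-second energy window.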
Upvotes: 10