Robotbugs

Reputation: 4387

Implementing audio delay effects unit

I am creating an audio delay unit. Essentially the incoming samples go into a circular buffer and get picked out from some read pointer that is a number of samples behind the write pointer. These pointers are incremented by one for each new sample.

Additionally, in order to allow for fractional delays I actually have two read pointers one sample apart and use linear interpolation to blend between them depending on the floating-point delay parameter. (I could use sinc interpolation or something else but have not bothered with that yet.)
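The scheme described above can be sketched as follows. This is a minimal illustration, not the asker's actual code; the class and method names are invented for the example.

```python
class DelayLine:
    """Circular-buffer delay with two read taps one sample apart,
    blended by linear interpolation for fractional delays."""

    def __init__(self, max_delay_samples):
        self.buf = [0.0] * max_delay_samples
        self.write = 0  # write pointer, advanced once per sample

    def process(self, x, delay):
        """Write one input sample; read one output delayed by a
        (possibly fractional) number of samples."""
        n = len(self.buf)
        self.buf[self.write] = x
        d0 = int(delay)        # integer part of the delay
        frac = delay - d0      # fractional part
        r0 = (self.write - d0) % n       # first read tap
        r1 = (self.write - d0 - 1) % n   # second tap, one sample older
        y = (1.0 - frac) * self.buf[r0] + frac * self.buf[r1]
        self.write = (self.write + 1) % n
        return y
```

Feeding an impulse through with `delay=2.5` spreads it equally across output samples 2 and 3, which is the interpolation doing its job.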

It all works fine when the delay is set to a particular value. But when the user varies the delay while sounds are playing, a crackling noise is also apparent due to the changing delay taps. Presumably it is picking off the signal at varying samples and introducing random step discontinuities in the audio waveform.

I was wondering if there are any DSP audio buffs out there who know how to work around this problem, because I know that I have played with delay boxes where this effect does not happen, but at present I can't think of a solution.

Upvotes: 4

Views: 1736

Answers (4)

Dzintars Licis

Reputation: 21

The solution that first came to my head: because audio is essentially a waveform oscillating rapidly up and down, you can usually find a sample whose value is close to your previous output somewhere nearby. So when the user increases or decreases the delay, you could land on a sample close in value somewhere in the vicinity of the requested sample count, rather than exactly on the one requested if that one is far off in value. From there, an automated finishing process could gradually slide the delay into place, incrementing or decrementing it sample by sample instead of in large jumps, until the requested value is finally reached. In terms of sample rate, even landing 10 to 50 samples off could be worked down to 0 within 50 to 250 samples, i.e. 1-5 ms at a typical 48 kHz. I'd guess that's quick enough to be barely perceptible. I don't know how it would sound, and I don't have hardware to test it on, but as an idea I think it is plausible. Drop me a line if anyone tries it out and finds it any good.
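One way to read the "slide into place" part of this suggestion is as a slew limiter on the delay value: the active delay moves toward the requested delay by at most one sample per output sample, so the read tap glides instead of jumping. A minimal sketch (the function name is illustrative):

```python
def slew_delay(current, target, max_step=1.0):
    """Move the active delay toward the requested delay by at most
    max_step samples per call, clamping so it never overshoots."""
    if target > current:
        return min(current + max_step, target)
    return max(current - max_step, target)
```

Called once per output sample, this closes a 50-sample gap in 50 samples, matching the timing estimate above.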

Upvotes: 0

Shannon Matthews

Reputation: 10368

When the user varies the delay while sounds are playing, a crackling noise is also apparent due to the changing delay taps. Presumably it is picking off the signal at varying samples and introducing random step discontinuities in the audio waveform.

Yep. That's exactly right. I think the most natural way around this is to smooth the change in delay time. For example, instead of jumping from 200 milliseconds of delay to 500 milliseconds, your delay effect should smoothly change from 200ms to 500ms over some period of time. With this technique the pitch of the delayed audio will drop or rise depending on how quickly the delay time is changed.
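The simplest form of this smoothing is a linear ramp: generate a per-sample delay value that moves from the old setting to the new one over a fixed number of samples, and feed each value to the interpolating delay line. A sketch (names are illustrative):

```python
def delay_ramp(start, end, num_samples):
    """Yield per-sample delay values moving linearly from start to end
    over num_samples samples, ending exactly at end."""
    step = (end - start) / num_samples
    for i in range(1, num_samples + 1):
        yield start + step * i
```

With, say, a 100 ms ramp time, a 200ms-to-500ms change becomes a brief pitch drop rather than a click.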

This technique can introduce aliasing (as Bjorn Roche indicates in the comments). If that is a problem one solution is to use bandlimited interpolation.

Upvotes: 0

Luis Mendo

Reputation: 112689

I don't know how DSP units do it, but I suggest you apply a lowpass filter to the delay set by the user in order to smooth out any abrupt changes.

Let set_delay[n] represent the delay set by the user at time sample n, and let filtered_delay[n] be the filtered delay. The simplest lowpass filter is a one-pole IIR filter:

filtered_delay[n] = (1-alpha)*filtered_delay[n-1] + alpha*set_delay[n]

The value alpha controls the time constant of the filter. It should be chosen as a tradeoff between responsiveness and smoothness.

For example, if the user suddenly changes the set_delay from 100 to 50 samples, the filtered_delay with alpha=0.1 will be: 95, 90.5, 86.45, ..., thus slowly decreasing from 100 to 50.
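This filter is a couple of lines of code. The sketch below implements the recurrence above and reproduces the example numbers (function name is illustrative):

```python
def smooth_delay(set_delay, initial, alpha=0.1):
    """One-pole IIR lowpass on the user-set delay:
    filtered[n] = (1 - alpha) * filtered[n-1] + alpha * set_delay[n]."""
    filtered = initial
    out = []
    for s in set_delay:
        filtered = (1.0 - alpha) * filtered + alpha * s
        out.append(filtered)
    return out
```

Run once per sample, the filtered value is what you hand to the fractional-delay read taps; larger alpha tracks the knob faster, smaller alpha is smoother.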

Upvotes: 0

Brad

Reputation: 163262

When you're changing the delay, there will always be some sort of distortion. It's just a matter of picking what you want.

As you've found, if you just drop into a random spot, the sharp jump from one sample value to the next will often cause an audible pop. One option is to simply mute the audio for a small period of time and start it again. If this is still too abrupt, you could scale the values down to zero over a few milliseconds, and scale them back up on the new buffer position over a few milliseconds. (Effectively turning the volume down quickly, and then back up once you're at the new position.)
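The fade-down/fade-up approach amounts to applying a gain envelope around the jump: ramp the output gain to zero over a few milliseconds, move the read pointer, then ramp back up. A minimal sketch of the envelope (name is illustrative; a crossfade between the old and new taps is a common variant):

```python
def fade_envelope(fade_len):
    """Gain values for a delay-tap jump: fade to silence over fade_len
    samples, then back up to unity over fade_len samples. The read
    pointer is moved at the midpoint, while the output is silent."""
    down = [1.0 - (i + 1) / fade_len for i in range(fade_len)]
    up = [(i + 1) / fade_len for i in range(fade_len)]
    return down + up
```

At 48 kHz, `fade_len` of a few hundred samples gives the "few milliseconds" mentioned above.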

Propellerhead's Reason actually simulates speeding up and slowing down the recording as if it were a tape delay, and you were moving the head. This is pretty complicated... you are effectively re-sampling your buffer audio dynamically until you get to the new buffer location.

Upvotes: 4
