bigb055

Reputation: 308

Calculating delay with exponential backoff

I'm writing my own retry logic with exponential backoff, based on Microsoft's sample code on the following page: https://learn.microsoft.com/en-us/dotnet/standard/microservices-architecture/implement-resilient-applications/explore-custom-http-call-retries-exponential-backoff

In the following line of code there is a division by 2 that I can't understand:

int delay = Math.Min(m_delayMilliseconds * (m_pow - 1) / 2,
        m_maxDelayMilliseconds);

Assume I defined int m_delayMilliseconds = 200, so we get the following delays:

200 * 1 / 2  --> 100 ms
200 * 2 / 2  --> 200 ms
200 * 4 / 2  --> 400 ms
200 * 8 / 2  --> 800 ms
200 * 16 / 2 --> 1600 ms
... etc.

What bothers me is that the first delay is 100 ms, but I want the minimum delay to be 200 ms, as defined. Can someone explain this to me?

Upvotes: 1

Views: 6650

Answers (1)

CasualCoder

Reputation: 129

I think you have your intentions swapped. If you wanted a maximum delay of 200 ms and set m_maxDelayMilliseconds to 200, this code would work as written. However, you say you want the minimum delay to be 200 ms. In that case you should rename the variable to m_minDelayMilliseconds and change Math.Min() to Math.Max(). If you want to enforce both a minimum and a maximum delay, you can use Math.Clamp(), as in the sketch below.
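Here is a minimal sketch of that idea, assuming a helper shaped roughly like the one in the Microsoft article. The m_minDelayMilliseconds field and the surrounding class are my own illustration, not part of the original sample:

using System;
using System.Threading.Tasks;

// Backoff helper that keeps the exponential term from the question but
// clamps the result between a configured minimum and maximum delay.
public sealed class ExponentialBackoff
{
    private readonly int m_maxRetries;
    private readonly int m_delayMilliseconds;    // base step, 200 in the question
    private readonly int m_minDelayMilliseconds; // lower bound (assumption, not in the sample)
    private readonly int m_maxDelayMilliseconds; // upper bound
    private int m_retries;
    private int m_pow = 1;

    public ExponentialBackoff(int maxRetries, int delayMilliseconds,
                              int minDelayMilliseconds, int maxDelayMilliseconds)
    {
        m_maxRetries = maxRetries;
        m_delayMilliseconds = delayMilliseconds;
        m_minDelayMilliseconds = minDelayMilliseconds;
        m_maxDelayMilliseconds = maxDelayMilliseconds;
    }

    public Task Delay()
    {
        if (m_retries == m_maxRetries)
            throw new TimeoutException("Max retry attempts exceeded.");

        ++m_retries;
        if (m_retries < 31)
            m_pow <<= 1; // m_pow == 2^m_retries

        // Raw exponential term: 100 ms on the first retry for a 200 ms base.
        int raw = m_delayMilliseconds * (m_pow - 1) / 2;

        // Math.Clamp enforces both bounds; Math.Max(raw, m_minDelayMilliseconds)
        // alone would enforce only the lower bound.
        int delay = Math.Clamp(raw, m_minDelayMilliseconds, m_maxDelayMilliseconds);
        return Task.Delay(delay);
    }
}

With delayMilliseconds = 200 and minDelayMilliseconds = 200, the first raw value of 100 ms is raised to 200 ms, and later delays grow until they reach maxDelayMilliseconds.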

Upvotes: 0
