user20988246

Reputation: 1

Self-coded gradient descent: loss gradually increasing while optimizing (Python)

I'm trying to estimate the two parameters of the exponential decay model below, but the error (loss) gradually increases during optimization.

I have tried:

- smaller learning rates, from 10^-2 down to 10^-10
- recalculating the partial derivatives
- different data sets

The parameters didn't oscillate; they just changed steadily, i.e. only increasing or only decreasing.

[figure: exponential decay]

code and the data is here: https://github.com/psmuler/temp.git

What is wrong with the code? If I change the minus in
tau - dif_tau/len(data), b - dif_b/len(data)
on line 35 of the code to a plus (+), it works, but surely that is not the real solution. Maybe the partial differentiation is wrong,

or do I just misunderstand the very basics?
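
For reference, here is a minimal sketch of what I understand one gradient-descent step should look like, assuming the model f(t) = b*(1 - exp(-t/tau)) (inferred from the derivatives in my code) and a mean squared-error loss; the names below are illustrative, not necessarily the ones in the repo:

import math

def f(t, tau, b):
    return b * (1.0 - math.exp(-t / tau))

def df_dtau(t, tau, b):
    # d/dtau [b * (1 - exp(-t/tau))] = -b * (t / tau**2) * exp(-t/tau)
    return -b * (t / tau**2) * math.exp(-t / tau)

def df_db(t, tau, b):
    # d/db [b * (1 - exp(-t/tau))] = 1 - exp(-t/tau)
    return 1.0 - math.exp(-t / tau)

def step(data, tau, b, lr):
    # one gradient-descent step on the mean squared error
    g_tau, g_b = 0.0, 0.0
    for t, y in data:
        r = f(t, tau, b) - y              # residual
        g_tau += r * df_dtau(t, tau, b)
        g_b += r * df_db(t, tau, b)
    n = len(data)
    # descend: subtract the gradient; if a derivative has a flipped
    # sign, this subtraction climbs the loss instead of descending
    return tau - lr * g_tau / n, b - lr * g_b / n

My understanding is that with correct derivatives the minus in the update is right, and a sufficiently small learning rate should make the loss decrease.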

Thank you.

Update: after changing the minus to a plus I got tau = 1291.352909, b = 0.14934105 on data set 1_7, which corresponds to the data quite well.

Upvotes: 0

Views: 59

Answers (1)

RyS

Reputation: 11

You forgot to put a minus sign before time in the exponent.

def dif_f0_b(time, tau, b):
    return (-1)*math.exp(time/tau) + 1

must be

def dif_f0_b(time, tau, b):
    return (-1)*math.exp(-time/tau) + 1
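
With exp(time/tau) the derivative has the wrong sign (and grows with time instead of decaying), so subtracting the gradient moves the parameters uphill; that is why flipping the minus to a plus in the update step seemed to work. A quick way to catch this kind of mistake is to compare the analytic derivative with a numerical finite difference. A minimal check, where f0 is my reconstruction of the model b*(1 - exp(-time/tau)) implied by the corrected derivative:

import math

def f0(time, tau, b):
    # assumed model: b * (1 - exp(-time/tau))
    return b * (1.0 - math.exp(-time / tau))

def dif_f0_b(time, tau, b):
    return (-1) * math.exp(-time / tau) + 1

# central finite difference with respect to b
time, tau, b, h = 50.0, 1000.0, 0.15, 1e-6
numeric = (f0(time, tau, b + h) - f0(time, tau, b - h)) / (2 * h)
print(dif_f0_b(time, tau, b), numeric)  # the two values should agree closely

The buggy version returns 1 - exp(time/tau), which disagrees with the finite difference immediately.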

Upvotes: 1
