Nena

Reputation: 21

Pytorch vs Numpy for second derivative evaluation: numerical issues

I am comparing second-derivative calculations using NumPy and PyTorch.

Here are the code and the results with NumPy:

import numpy as np
import matplotlib.pyplot as plt

dt  = 1e-6
t   = np.arange(0,10,dt)

u_SZ   = np.cos(2*t)

# ---------------------
# --- NUMPY ---
# ---------------------
t_SZ_c    = t[1:-1]

u_SZ_t    = u_SZ[2:]
u_SZ_c    = u_SZ[1:-1]
u_SZ_b    = u_SZ[:-2]
u_t_SZ_t  = (u_SZ_t - u_SZ_c) / dt
u_t_SZ_b  = (u_SZ_c - u_SZ_b) / dt
u_t_SZ_c  = (u_t_SZ_t + u_t_SZ_b) / 2
u_tt_SZ_c = (u_t_SZ_t - u_t_SZ_b) / dt

ax = plt.subplot(131)
ax.plot(t_SZ_c, u_SZ_c)
ax = plt.subplot(132)
ax.plot(t_SZ_c, u_t_SZ_c)
ax = plt.subplot(133)
ax.plot(t_SZ_c, u_tt_SZ_c)
plt.show(block = True)

[plots of u_SZ_c, u_t_SZ_c and u_tt_SZ_c]

Which is expected.
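(As a sanity check, not in the original script: the stencil above is algebraically the centred 3-point formula (u[i+1] - 2*u[i] + u[i-1]) / dt**2, so in float64 it should recover u'' = -4*cos(2t) closely. A quick check with a larger dt, purely to keep it fast:)

```python
import numpy as np

dt = 1e-4                 # larger step than in the question, just for speed
t = np.arange(0, 10, dt)
u = np.cos(2 * t)

# Same stencil as above: forward and backward first differences,
# then their difference for the centred second derivative.
u_t_top = (u[2:] - u[1:-1]) / dt
u_t_bot = (u[1:-1] - u[:-2]) / dt
u_tt = (u_t_top - u_t_bot) / dt   # == (u[2:] - 2*u[1:-1] + u[:-2]) / dt**2

exact = -4 * np.cos(2 * t[1:-1])  # analytic second derivative of cos(2t)
print(np.max(np.abs(u_tt - exact)))  # small: the scheme is second-order accurate
```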

The same but with torch:

import torch

dt = 1e-6
t_SZ = torch.arange(0, 10, dt).view(-1, 1).requires_grad_(True)
u_SZ = torch.cos(2*t_SZ)
t_SZ_c    = t_SZ[1:-1]

u_SZ_t    = u_SZ[2:]
u_SZ_c    = u_SZ[1:-1]
u_SZ_b    = u_SZ[:-2]
u_t_SZ_t  = (u_SZ_t - u_SZ_c) / dt
u_t_SZ_b  = (u_SZ_c - u_SZ_b) / dt
u_t_SZ_c  = (u_t_SZ_t + u_t_SZ_b) / 2
u_tt_SZ_c = (u_t_SZ_t - u_t_SZ_b) / dt

t_SZ_c    = t_SZ_c.detach().numpy()
u_SZ_c    = u_SZ_c.detach().numpy()
u_t_SZ_c  = u_t_SZ_c.detach().numpy()
u_tt_SZ_c = u_tt_SZ_c.detach().numpy()


ax = plt.subplot(131)
ax.plot(t_SZ_c, u_SZ_c)
ax = plt.subplot(132)
ax.plot(t_SZ_c, u_t_SZ_c)
ax = plt.subplot(133)
ax.plot(t_SZ_c, u_tt_SZ_c)
plt.show(block = True)

[plots of u_SZ_c, u_t_SZ_c and u_tt_SZ_c]

The second derivative literally explodes, and if I zoom in on the first derivative I notice a lot of local oscillations that do not exist in the first example.
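For what it's worth, I can reproduce the same blow-up without torch at all by repeating the stencil in single precision (torch.arange defaults to float32, while the NumPy run above used float64), so I suspect precision is involved:

```python
import numpy as np

dt = 1e-6
# float32, as torch.arange produces by default; shorter range to keep it quick
t = np.arange(0, 0.01, dt, dtype=np.float32)
u = np.cos(2 * t)

# Centred 3-point second difference, same as the stencil in the question.
u_tt = (u[2:] - 2 * u[1:-1] + u[:-2]) / dt**2

# Deviation from the analytic value -4*cos(2t): huge in float32,
# because neighbouring u values differ by less than one ulp.
print(np.max(np.abs(u_tt + 4 * np.cos(2 * t[1:-1]))))
```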

Can somebody explain to me why this happens?

If I compute the derivatives with PyTorch's autograd instead, I do not encounter the issue. However, I would like to understand why, or what in the code causes this behaviour.
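For completeness, this is the kind of autograd version I mean (a sketch, the variable names are mine): grad with create_graph=True lets me differentiate the first derivative a second time, and the result is clean.

```python
import torch

t = torch.linspace(0, 10, 1001, requires_grad=True)
u = torch.cos(2 * t)

# First derivative via autograd; create_graph=True keeps the graph
# so we can differentiate again.
u_t, = torch.autograd.grad(u.sum(), t, create_graph=True)
# Second derivative.
u_tt, = torch.autograd.grad(u_t.sum(), t)

# Compare with the analytic value u'' = -4*cos(2t): agrees to float32 precision.
print(torch.max(torch.abs(u_tt + 4 * torch.cos(2 * t))).item())
```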

Upvotes: 2

Views: 66

Answers (0)
