Reputation: 101
That is to say: suppose I have a differentiable model g and a differentiable function f (which could itself include models).
with tf.GradientTape(persistent=True) as tape:
    for _ in range(n):
        r = g(r)
    loss = f(r)
grad = tape.gradient(loss, g.trainable_variables)
Would tape.gradient apply backpropagation through time for n steps on g?
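Yes: the tape records every application of g inside the loop, so the gradient flows back through all n steps. A minimal sketch that checks this, using a hypothetical one-parameter "model" g(r) = w * r (so the analytic gradient is easy to verify), assuming TensorFlow 2.x eager mode:

```python
import tensorflow as tf

# Hypothetical stand-in for a model: a single trainable weight w, g(r) = w * r.
w = tf.Variable(2.0)

def g(r):
    return w * r

n = 3
r = tf.constant(1.0)  # initial state r0

with tf.GradientTape() as tape:
    for _ in range(n):
        r = g(r)  # after the loop, r = w**n * r0
    loss = r      # identity in place of f, for simplicity

grad = tape.gradient(loss, w)

# Analytically: loss = w**n * r0, so d(loss)/dw = n * w**(n-1) * r0
# With w = 2.0, n = 3, r0 = 1.0 this is 3 * 4 * 1 = 12.
print(float(grad))  # -> 12.0
```

The gradient matches the n-step unrolled derivative, which is exactly backpropagation through time: the tape replays all n applications of g in reverse.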
Upvotes: 0
Views: 219
Reputation: 21
You can use matplotlib to plot the two curves and compare them: the red line is the target and the blue line is the prediction.
tf.GradientTape(
    persistent=False, watch_accessed_variables=True
)
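A minimal sketch of the suggested comparison plot, using made-up target and prediction arrays (the `target`/`pred` data here is illustrative only, not from the question):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Hypothetical data: 'target' is the ground truth, 'pred' a noisy model output.
x = np.linspace(0, 2 * np.pi, 100)
target = np.sin(x)
pred = target + 0.1 * np.random.default_rng(0).normal(size=x.shape)

plt.plot(x, target, "r-", label="target")    # red line: target
plt.plot(x, pred, "b-", label="prediction")  # blue line: prediction
plt.legend()
plt.savefig("compare.png")
```

Overlaying both series on one axis makes systematic offsets or lag between prediction and target easy to spot by eye.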
Upvotes: 2