JobHunter69

Reputation: 2290

Why does not using retain_graph=True result in an error?

If I need to backpropagate through a neural network twice and I don't use retain_graph=True, I get an error.

Why? I realize it is convenient to keep the intermediate values from the first backpropagation so they can be reused for the second one. But why aren't they simply recalculated, the way they were computed during the first backpropagation?
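
For reference, a minimal sketch of the behavior I mean (tensor names are just illustrative):

```python
import torch

# Tiny computation graph: y = sum(x^2)
x = torch.tensor([2.0], requires_grad=True)
y = (x ** 2).sum()

y.backward()  # first backward pass works; the graph is freed afterwards
y.backward()  # second pass raises a RuntimeError about backwarding
              # through the graph a second time
```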

Upvotes: 0

Views: 91

Answers (1)

trsvchn

Reputation: 8991

By default, PyTorch doesn't keep intermediate buffers around after backpropagation. One of PyTorch's main features is dynamic computational graphs: the graph is rebuilt on every forward pass, so once `backward()` finishes, the graph is freed and all the intermediate buffers it saved are destroyed. They aren't recalculated automatically because recomputing them would require re-running the forward pass; if you want a second backward pass over the same graph, pass `retain_graph=True` so the buffers are kept, or run the forward pass again.
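
A small sketch of the two options (the tensors here are just a toy example):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = (x ** 2).sum()

# Option 1: keep the saved buffers alive for a second backward pass.
y.backward(retain_graph=True)
y.backward()          # works; gradients accumulate into x.grad
print(x.grad)         # tensor([8.]) -- two passes of dy/dx = 2x = 4

# Option 2: re-run the forward pass, which rebuilds the graph from scratch.
x.grad = None
y = (x ** 2).sum()
y.backward()
print(x.grad)         # tensor([4.])
```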

Upvotes: 0
