The TensorFlow documentation states that a Variable can be used any place a Tensor can be used, and they seem to be fairly interchangeable. For example, if v is a Variable, then x = 1.0 + v becomes a Tensor.
What is the difference between the two, and when would I use one over the other?
Upvotes: 12
Views: 4218
It's true that a Variable can be used any place a Tensor can, but there are key differences between the two: a Variable maintains its state across multiple calls to run(), and its value can be updated during training (typically by an optimizer applying the gradients computed by backpropagation). A Variable can also be saved and restored, as described in the documentation.
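To make the state-keeping difference concrete, here is a minimal sketch (assuming the graph-based TensorFlow 1.x API that run() refers to; the names counter, increment and doubled are purely illustrative):

```python
import tensorflow as tf

# A Variable holds state that persists between run() calls.
counter = tf.Variable(0, name="counter")
increment = tf.assign_add(counter, 1)   # an op that updates the Variable in place

# A plain Tensor is just the result of an operation, recomputed on demand.
doubled = counter * 2

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(increment))  # 1
    print(sess.run(increment))  # 2  <- state carried over from the previous run()
    print(sess.run(doubled))    # 4  <- recomputed from the Variable's current value
```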
These differences mean that you should think of a variable as representing your model's trainable parameters (for example, the weights and biases of a neural network), while you can think of a Tensor as representing the data being fed into your model and the intermediate representations of that data as it passes through your model.
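As a sketch of that split (again assuming the 1.x graph API; the placeholder names and shapes are just for illustration), a tiny linear model might look like this:

```python
import tensorflow as tf

# Data flowing through the model: Tensors.
x = tf.placeholder(tf.float32, shape=[None, 3], name="inputs")
y_true = tf.placeholder(tf.float32, shape=[None, 1], name="targets")

# Trainable parameters of the model: Variables.
W = tf.Variable(tf.zeros([3, 1]), name="weights")
b = tf.Variable(tf.zeros([1]), name="bias")

# Intermediate representation of the data: also a Tensor.
y_pred = tf.matmul(x, W) + b

# The training op updates only the Variables (W and b);
# x, y_pred and loss are recomputed on every run().
loss = tf.reduce_mean(tf.square(y_pred - y_true))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
```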
Upvotes: 16