Reputation: 51
How does TensorFlow handle None in a tf.placeholder shape? Does a new graph with a different input size get created for each run? Is there any performance penalty in training/testing as a result?
x = tf.placeholder(tf.float32, shape=[None, None, None, None])
y = ...   # some op built on top of x
lten = [...]  # list of rank-4 tensors with various shapes
for ten in lten:
    feed_dict = {x: ten}
    out = sess.run(y, feed_dict=feed_dict)
As you can see, the concrete values taken by the None dimensions can differ from run to run.
Upvotes: 0
Views: 648
Reputation: 4460
Background: TensorFlow propagates shape information across the entire graph from the incoming shapes during the graph-construction phase. The shape-inference code of a typical op therefore looks like:
.SetShapeFn([](shape_inference::InferenceContext* c) {
  c->set_output(0, c->input(0));  // output shape <- input shape
  return Status::OK();
});
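You can observe this static propagation from the Python side directly on the tensors' shapes. A minimal sketch (TF 1.x graph-mode API as in your question; the concrete dimensions are just an illustration):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 28, 28, 3])
y = tf.identity(x)                    # shape fn copies the input shape to the output
print(y.shape)                        # (?, 28, 28, 3) -- known dims propagated at construction time
z = tf.reshape(x, [-1, 28 * 28 * 3])
print(z.shape)                        # (?, 2352) -- inferred statically, no Session needed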
So each graph operation can infer its output shapes. However, there are some ops that do not know their output shape:
.SetShapeFn(tensorflow::shape_inference::UnknownShape);
This behaves like your placeholder with unknown dimensions, and that is fine.
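From Python, such an op simply reports an unknown static shape. For instance, tf.py_func cannot infer its output shape statically (a small sketch, TF 1.x API):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, None, None, None])
# py_func gives no static shape information at all
y = tf.py_func(lambda a: a * 2.0, [x], tf.float32)
print(y.shape)   # <unknown>
print(x.shape)   # (?, ?, ?, ?) -- the placeholder at least knows its rank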
Answer: There is no performance penalty, because the concrete dimensions are extracted on each and every compute call anyway:
// Set all the elements of the output tensor to 0.
const int N = input.size();  // the actual size is read every time the kernel runs
for (int i = 0; i < N; i++) {
  output(i) = 0;
}
So there is only one graph.
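You can check this yourself: feed tensors of different shapes through the same graph and observe that the number of operations in the graph never changes. A sketch (TF 1.x; the shapes are arbitrary):

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, None, None, None])
y = tf.reduce_mean(x)
g = tf.get_default_graph()

with tf.Session() as sess:
    for shape in [(1, 32, 32, 3), (2, 64, 48, 1), (4, 224, 224, 3)]:
        n_ops = len(g.get_operations())
        out = sess.run(y, feed_dict={x: np.ones(shape, np.float32)})
        assert len(g.get_operations()) == n_ops  # no new ops: still one and the same graph
        print(shape, out)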
Still, providing shape information in the placeholder is helpful for yourself and should be considered best practice whenever possible, because it helps ensure your graph is correct. Think of using the output of an operation in two contradictory ways, e.g. applying the same dense layer to two different image sizes. With known shapes you notice this issue directly during graph construction, and the error message raised by TensorFlow is likely to be more comprehensible. It then becomes a kind of compile-time error rather than a run-time error.
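As an illustration of that "compile-time" check, consider the dense-layer case (a sketch, TF 1.x; the sizes are made up):

import tensorflow as tf

a = tf.placeholder(tf.float32, shape=[None, 28 * 28])
b = tf.placeholder(tf.float32, shape=[None, 32 * 32])
dense = tf.layers.Dense(10)
ya = dense(a)   # builds the layer's kernel for inputs of size 784
yb = dense(b)   # raises ValueError already at graph construction, not at sess.run()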
Sidenote: Some potential future graph operations might need to know the exact shape beforehand (though I haven't found such an op in the source yet).
Upvotes: 1