Reputation: 2217
I am running a simple neural network for linear regression, but TensorFlow is complaining that my feed_dict
placeholder(s) are not an element of the graph. However, my placeholders and my model are all defined within my graph, as can be seen below:
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense

with tf.Graph().as_default():
    x = tf.placeholder(dtype=tf.float32, shape=(None, 4))
    y = tf.placeholder(dtype=tf.float32, shape=(None, 4))
    model = tf.keras.Sequential([
        Dense(units=4, activation=tf.nn.relu)
    ])
    y = model(x)
    loss = tf.reduce_mean(tf.square(y - x))
    train_op = tf.train.AdamOptimizer().minimize(loss)

with tf.Session() as sess:
    sess.run(train_op, feed_dict={x: np.ones(dtype='float32', shape=(4)),
                                  y: 5 * np.ones(dtype='float32', shape=(4,))})
This gives an error:
TypeError: Cannot interpret feed_dict key as Tensor: Tensor
Tensor("Placeholder:0", shape=(?, 4), dtype=float32) is not an element of this graph.
____________UPDATE________________
Following the advice from @Silgon and @Mcangus, I have modified the code:
g = tf.Graph()
with g.as_default():
    x = tf.placeholder(dtype=tf.float32, shape=(None, 4))
    model = tf.keras.Sequential([
        Dense(units=4, activation=tf.nn.relu)
    ])
    y = model(x)
    loss = tf.reduce_mean(tf.square(y - x))
    train_op = tf.train.AdamOptimizer().minimize(loss)
    init_op = tf.group(tf.global_variables_initializer(),
                       tf.local_variables_initializer())

with tf.Session(graph=g) as sess:
    sess.run(init_op)
    for i in range(5):
        _, answer = sess.run([train_op, loss],
                             feed_dict={x: np.ones(dtype='float32', shape=(1, 4)),
                                        y: 5 * np.ones(dtype='float32', shape=(1, 4))})
        print(answer)
However, the model doesn't appear to be learning:
16.0
16.0
16.0
16.0
16.0
Upvotes: 0
Views: 191
Reputation: 7211
The error tells you that the variable is not an element of the graph. It might be because it's not in the same scope. One way to solve it is to have a structure like the following.
# define a graph
graph = tf.Graph()
with graph.as_default():
    # placeholders
    x = tf.placeholder(...)
    y = tf.placeholder(...)
    # create model
    model = create_model(x, w, b)

with tf.Session(graph=graph) as sess:
    # initialize all the variables
    sess.run(init)
Also, as @Mcangus points out, be careful with the definition of your variables.
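Putting those pieces together, here is a minimal end-to-end sketch of that structure under TensorFlow 1.x graph mode; the names y_pred and init_op and the single Dense layer are only illustrative choices, not part of the original code:

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense

graph = tf.Graph()
with graph.as_default():
    # placeholders for inputs and targets
    x = tf.placeholder(dtype=tf.float32, shape=(None, 4))
    y = tf.placeholder(dtype=tf.float32, shape=(None, 4))
    # keep the model output in its own variable so y stays a placeholder
    y_pred = tf.keras.Sequential([Dense(units=4)])(x)
    loss = tf.reduce_mean(tf.square(y_pred - y))
    train_op = tf.train.AdamOptimizer().minimize(loss)
    init_op = tf.global_variables_initializer()

# pass the graph explicitly so the session sees the same placeholders
with tf.Session(graph=graph) as sess:
    sess.run(init_op)
    for _ in range(5):
        _, loss_val = sess.run(
            [train_op, loss],
            feed_dict={x: np.ones((1, 4), dtype='float32'),
                       y: 5 * np.ones((1, 4), dtype='float32')})
        print(loss_val)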
Upvotes: 2
Reputation: 1856
I believe your issue is this line:
y = model(x)
You overwrite y with the output of your model, so it's no longer a placeholder.
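One rough way to fix it is to keep y as the target placeholder, give the model output its own name (y_pred here is just an illustrative choice), and compute the loss against the placeholder:

x = tf.placeholder(dtype=tf.float32, shape=(None, 4))
y = tf.placeholder(dtype=tf.float32, shape=(None, 4))  # target placeholder stays intact
y_pred = model(x)                                       # model output gets its own name
loss = tf.reduce_mean(tf.square(y_pred - y))            # compare prediction to the target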
Upvotes: 1