Reputation: 21
Did I get these comments right? Are these the 5 layers of my model, as described below?
# input - conv - conv - linear - linear(fc)
def model(data): # input Layer
    # 1 conv Layer
    conv = tf.nn.conv2d(data, layer1_weights, [1, 2, 2, 1], padding='SAME')
    hidden = tf.nn.relu(conv + layer1_biases) # Activation function
    # 1 conv Layer
    conv = tf.nn.conv2d(hidden, layer2_weights, [1, 2, 2, 1], padding='SAME')
    hidden = tf.nn.relu(conv + layer2_biases) # Activation function
    # not a layer (just reshape)
    shape = hidden.get_shape().as_list()
    reshape = tf.reshape(hidden, [shape[0], shape[1] * shape[2] * shape[3]])
    # 1 linear layer - not fc due to relu
    hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
    # 1 linear fully connected layer
    return tf.matmul(hidden, layer4_weights) + layer4_biases
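For context, here is a sketch of the variable definitions the model assumes (the shapes are hypothetical, chosen for a 28x28 grayscale input); note there are exactly four weight/bias pairs, one per trainable layer:
depth = 16       # number of conv filters (hypothetical)
num_hidden = 64  # size of the hidden fc layer (hypothetical)
num_labels = 10  # number of output classes (hypothetical)
# conv filters: [filter_height, filter_width, in_channels, out_channels]
layer1_weights = tf.Variable(tf.truncated_normal([5, 5, 1, depth], stddev=0.1))
layer1_biases = tf.Variable(tf.zeros([depth]))
layer2_weights = tf.Variable(tf.truncated_normal([5, 5, depth, depth], stddev=0.1))
layer2_biases = tf.Variable(tf.zeros([depth]))
# after two stride-2 convs with 'SAME' padding, 28x28 shrinks to 7x7
layer3_weights = tf.Variable(tf.truncated_normal([7 * 7 * depth, num_hidden], stddev=0.1))
layer3_biases = tf.Variable(tf.zeros([num_hidden]))
layer4_weights = tf.Variable(tf.truncated_normal([num_hidden, num_labels], stddev=0.1))
layer4_biases = tf.Variable(tf.zeros([num_labels]))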
Upvotes: 1
Views: 276
Reputation: 990
Your comment labels are right, but I think there is an issue with your code.
If you look at the definition of tf.nn.conv2d:
conv2d(
    input,
    filter,
    strides,
    padding,
    use_cudnn_on_gpu=True,
    data_format='NHWC',
    name=None
)
You see that the second argument is not a plain weights matrix but the filter (kernel): a 4-D tensor whose shape must be [filter_height, filter_width, in_channels, out_channels].
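So whatever you pass as layer1_weights has to be declared with that 4-D layout, e.g. (sizes are hypothetical):
# 5x5 kernel, 1 input channel, 16 output channels
layer1_weights = tf.Variable(tf.truncated_normal([5, 5, 1, 16], stddev=0.1))
conv = tf.nn.conv2d(data, layer1_weights, strides=[1, 2, 2, 1], padding='SAME')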
You could use tf.layers.conv2d instead. It simplifies the code by handling the weights, biases and activation in one call, e.g.:
conv1 = tf.layers.conv2d(data, filters, kernel_size=[2, 2], padding='same', activation=tf.nn.relu)
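Following that idea, the whole model could be rewritten with tf.layers (a sketch only; the filter counts and unit sizes here are hypothetical, and tf.layers.flatten / tf.layers.dense replace the manual reshape and matmul):
def model(data):
    # two conv layers, each with stride 2 as in the original code
    conv1 = tf.layers.conv2d(data, filters=16, kernel_size=[5, 5], strides=2, padding='same', activation=tf.nn.relu)
    conv2 = tf.layers.conv2d(conv1, filters=16, kernel_size=[5, 5], strides=2, padding='same', activation=tf.nn.relu)
    # flatten replaces the manual get_shape/reshape step
    flat = tf.layers.flatten(conv2)
    # fc layer with ReLU, then the final linear fc layer (no activation)
    hidden = tf.layers.dense(flat, units=64, activation=tf.nn.relu)
    return tf.layers.dense(hidden, units=10)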
Upvotes: 0
Reputation: 823
# 1 linear layer - not fc due to relu
hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
This is a fully connected layer, and its output is passed through a ReLU activation function. The layer itself is this part:
tf.matmul(reshape, layer3_weights) + layer3_biases
and you are sending that layer's output through a ReLU activation function:
tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
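Equivalently, you could write it in two steps to make the distinction explicit:
fc = tf.matmul(reshape, layer3_weights) + layer3_biases  # the fully connected layer itself
hidden = tf.nn.relu(fc)  # the ReLU activation applied to its output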
Other than this, everything seems fine.
Upvotes: 1