Reputation: 25
My problem seems like an easy one, but I can't figure out the syntax for it in Python TensorFlow. I have a simple neural network with an input layer, one hidden layer, and one output layer. The output layer consists of two neurons. Here is the problem: I want to keep the first output neuron linear, while the second output neuron should have a sigmoid activation function. I found that there is no such thing as "sliced assignment" in TensorFlow, but I did not find any work-around.
Here is an example snippet:
import sys
import tensorflow as tf

def multilayer_perceptron(x, weights, biases, act_fct):
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'], name='hidden_layer_op')
    if act_fct == 'sigmoid':
        layer_1 = tf.nn.sigmoid(layer_1)
        print('sigmoid')
    elif act_fct == 'relu':
        print('relu')
        layer_1 = tf.nn.relu(layer_1)
    elif act_fct == 'linear':
        print('linear')
    else:
        print('Unknown activation function')
        sys.exit()
    out_layer = tf.add(tf.matmul(layer_1, weights['out']), biases['out'], name='output_layer_op')
    ## DOES NOT WORK! Tensors do not support sliced assignment:
    out_layer[1] = tf.nn.sigmoid(out_layer[1])
    return out_layer
I am sure there is a very simple way to do this; hopefully someone can help me with it. P.S.: all the variables passed to the function have been initialized accordingly beforehand.
Best regards and thanks!
Upvotes: 0
Views: 3622
Reputation: 25
Thank you so much for your answer! It helped me get to a working solution.
I have n_features input neurons connected to 20 hidden neurons. These 20 hidden neurons are then connected to the 2 output neurons.
So the shape of layer_1 is (batch_size, 20) (or in fact (?, 20)). Furthermore, I ran into one slight problem with tf.concat: in TensorFlow versions before 1.0 the concatenation dimension is the first argument, and the axis= keyword does not exist. On an older version one has to write:
output = tf.concat(1,[output_1, output_2])
instead of
output = tf.concat([output_1, output_2],1)
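For future readers, here is a small compatibility sketch, purely illustrative: the helper name concat_compat is my own, and the string check on tf.__version__ is just one way to dispatch between the two signatures:

import tensorflow as tf

def concat_compat(values, axis):
    # Before TF 1.0 the signature was tf.concat(concat_dim, values);
    # from 1.0 onwards it is tf.concat(values, axis).
    if tf.__version__.startswith('0.'):
        return tf.concat(axis, values)
    return tf.concat(values, axis)

output = concat_compat([output_1, output_2], 1)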
Anyway, for future reference, here is the working code (initialization and connection):
Initialization:
weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1]), name='w_hidden'),
    'h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_1]), name='w_hidden2'),
    'out1': tf.Variable(tf.random_normal([n_hidden_1, 1]), name='w_out_1'),
    'out2': tf.Variable(tf.random_normal([n_hidden_1, 1]), name='w_out_2')
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1]), name='bias_hidden'),
    'b2': tf.Variable(tf.random_normal([n_hidden_1]), name='bias_hidden2'),
    'out1': tf.Variable(tf.random_normal([1]), name='bias_out1'),
    'out2': tf.Variable(tf.random_normal([1]), name='bias_out2')
}
Connection:
layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'], name='hidden_layer_op')
layer_1 = tf.nn.sigmoid(layer_1)
print('sigmoid')
# Each output neuron has its own (n_hidden_1, 1) weight matrix; both see the full hidden layer.
output_1 = tf.add(tf.matmul(layer_1, weights['out1']), biases['out1'], name='output_layer_op1')
output_2 = tf.add(tf.matmul(layer_1, weights['out2']), biases['out2'], name='output_layer_op2')
output_2 = tf.sigmoid(output_2)
out_layer = tf.concat(1, [output_1, output_2])
return out_layer
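For completeness, a minimal usage sketch under some assumptions: the connection code above is wrapped in a function (here called multilayer_perceptron) that returns out_layer, x is a placeholder of shape (None, n_input), and the pre-1.0 initializer name is used to match the old concat signature:

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, n_input])
out_layer = multilayer_perceptron(x, weights, biases)

with tf.Session() as sess:
    # tf.global_variables_initializer() from TF 1.0 onwards
    sess.run(tf.initialize_all_variables())
    batch = np.random.rand(5, n_input).astype(np.float32)
    print(sess.run(out_layer, feed_dict={x: batch}))  # prints a (5, 2) array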
Thanks and best regards!
Upvotes: 2
Reputation: 11895
I assume that layer_1 is a tensor with shape (batch_size, 2). Here is one way to do it:
import tensorflow as tf

batch_size = 3
layer_1 = tf.ones((batch_size, 2))
output_1 = layer_1[:, None, 0]              # first column stays linear
output_2 = tf.sigmoid(layer_1[:, None, 1])  # sigmoid on the second column
output = tf.concat([output_1, output_2], axis=-1)

with tf.Session() as sess:
    print(sess.run(output))
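The None in each slice inserts a singleton dimension, so output_1 and output_2 keep shape (batch_size, 1) and tf.concat can stitch them back into a (batch_size, 2) tensor along the last axis; the same pattern extends to any mix of per-column activations.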
Upvotes: 2