Massyanya

Reputation: 2934

Tensorflow: How to sensibly merge two neural network layers into one

Assume a simple TensorFlow neural net with two inputs:

# x and z are the two input placeholders
W0 = tf.Variable(tf.zeros([784, 100]))
b0 = tf.Variable(tf.zeros([100]))
h_a = tf.nn.relu(tf.matmul(x, W0) + b0)

W1 = tf.Variable(tf.zeros([100, 10]))
b1 = tf.Variable(tf.zeros([10]))
h_b = tf.nn.relu(tf.matmul(z, W1) + b1)

Question: What would be a good way to merge these two layers into one on the next layer?

I mean something like:

h_master = tf.nn.relu(tf.matmul(concat(h_a, h_b), W_master) + b_master)

However I can't seem to find a suitable function for this.


Edit: Note that if I do this:

h_master = tf.nn.tanh(tf.matmul(np.concatenate((h_a,h_b)),W_master) + b_master),

I get the following error:

ValueError: zero-dimensional arrays cannot be concatenated

(My guess is that this happens because NumPy sees the placeholder as an empty array, so h_a and h_b appear zero-dimensional.)

Upvotes: 3

Views: 2307

Answers (1)

Massyanya

Reputation: 2934

I found a way:

h_master = tf.nn.tanh(tf.matmul(tf.concat((h_a, h_b), axis=1), W_master) + b_master)

where:

W_master = tf.Variable(tf.random_uniform([110, 10], -0.01, 0.01))
b_master = tf.Variable(tf.zeros([10]))
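The 110 in W_master's shape comes from the concatenation: h_a has shape [batch, 100] and h_b has shape [batch, 10], so joining them along axis=1 produces [batch, 110]. A minimal NumPy sketch of that shape arithmetic (the batch size of 32 is an assumption for illustration):

```python
import numpy as np

batch = 32  # hypothetical batch size
h_a = np.zeros((batch, 100))  # stand-in for the first layer's output
h_b = np.zeros((batch, 10))   # stand-in for the second layer's output

# Concatenate along axis=1 (the feature axis), mirroring tf.concat((h_a, h_b), axis=1)
merged = np.concatenate((h_a, h_b), axis=1)
print(merged.shape)  # (32, 110) -> matches W_master's first dimension
```

This is also why np.concatenate fails in the question: it needs concrete arrays with known shapes, while h_a and h_b are symbolic tensors, so tf.concat must be used inside the graph instead.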

Upvotes: 2
