zeekzhen

Reputation: 159

Tensorflow- How to share CNN's filter weights for different input channel?

My 1-layer CNN's input data set has 10 channels. If I set the number of filter channels to 16, there will be 10*16 = 160 filter kernels.

I want to use the same 16 filters' weights for every input channel, i.e. use only 16 filters for my input data set, so that the 10 input channels share the same convolution filter weights.

Does anyone know how to do this in TensorFlow? Thanks a lot.

Upvotes: 3

Views: 1246

Answers (1)

DomJack

Reputation: 4183

You could use the lower-level tf.nn.conv1d with a filters arg constructed by tiling the same single-channel filters.

import tensorflow as tf
# Filters for a single input channel: shape (kernel_width, 1, filters_out).
f0 = tf.get_variable('filters', shape=(kernel_width, 1, filters_out), initializer=...)
# Repeat the same weights across all filters_in input channels.
f_tiled = tf.tile(f0, (1, filters_in, 1))
output = tf.nn.conv1d(input, f_tiled, ...)

However, you would get the same effect (and it would be much more efficient and less error prone) by simply adding all your input channels together to form a single-channel input and then using the higher-level layers API.

# Collapse the input channels into one by summing along the channel axis.
conv_input = tf.reduce_sum(input, axis=-1, keepdims=True)
output = tf.layers.conv1d(conv_input, filters=...)
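
As a sanity check, here is a minimal sketch (assuming TF 1.x, a random 10-channel input, and purely illustrative shapes) showing that the two formulations above give the same output up to floating-point error:

import numpy as np
import tensorflow as tf

x = tf.random_normal((2, 32, 10))                  # batch=2, width=32, 10 input channels
f0 = tf.get_variable('f0', shape=(5, 1, 16))       # kernel_width=5, 1 channel, 16 filters
tiled = tf.nn.conv1d(x, tf.tile(f0, (1, 10, 1)), stride=1, padding='SAME')
summed = tf.nn.conv1d(tf.reduce_sum(x, axis=-1, keepdims=True), f0, stride=1, padding='SAME')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    a, b = sess.run([tiled, summed])
    print(np.allclose(a, b, atol=1e-4))            # True: tiled filters == summing channels first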

Note that unless all your channels are almost equivalent, this is probably a bad idea. If you want to reduce the number of free parameters, consider multiple convolutions: a 1x1 convolution to reduce the number of filters, one or more convolutions with wide kernels and non-linearities, then a 1x1 convolution to get back to a large number of filters (see the sketch below). The reduce_sum in the implementation above is effectively a 1x1 convolution with fixed weights of tf.ones, and unless your dataset is tiny you'll almost certainly get a better result from learning the weights followed by some non-linearity.
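
A minimal sketch of that bottleneck pattern, assuming TF 1.x; the channel counts and kernel widths here are just examples, not prescriptions:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 128, 10))                              # batch, width, 10 channels

squeeze = tf.layers.conv1d(x, filters=4, kernel_size=1, activation=tf.nn.relu)     # 1x1: shrink to 4 channels
body = tf.layers.conv1d(squeeze, filters=4, kernel_size=9, padding='same',
                        activation=tf.nn.relu)                                     # wide kernel on few channels
expand = tf.layers.conv1d(body, filters=16, kernel_size=1, activation=tf.nn.relu)  # 1x1: back up to 16 filters

The wide-kernel convolution operates on only a few channels, so most of the parameter savings come from the squeeze step, while the final 1x1 convolution restores the desired number of output filters.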

Upvotes: 3
