MichaelSB

Reputation: 3181

Bidirectional RNN cells - shared or not?

Should I use the same weights to compute forward and backward passes in a bidirectional RNN, or should those weights be learned independently?

Upvotes: 5

Views: 1587

Answers (3)

thushv89

Reputation: 11333

Personally, I haven't often seen the same set of weights used for both the forward and backward passes, and sharing them strikes me as counterintuitive.

The idea of a bidirectional RNN is to have two hidden states for each input: one summarizing what comes (or should come) before the current input, and one summarizing what comes after it. You can't get two such different states for an input if both directions use the same shared set of weights.
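To make that concrete, here is a minimal sketch (assuming TensorFlow 2.x / tf.keras; not part of the original answer) showing that a bidirectional layer emits two hidden states per timestep, which the wrapper concatenates by default:

import numpy as np
import tensorflow as tf

# Each direction produces its own hidden state per timestep; by default the
# Bidirectional wrapper concatenates them, so the output width is 2 * units.
layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.SimpleRNN(3, return_sequences=True))

x = np.random.rand(1, 5, 2).astype("float32")  # (batch, timesteps, features)
y = layer(x)
print(y.shape)  # (1, 5, 6): forward state (3) and backward state (3) per timestep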

Upvotes: 2

nuric

Reputation: 11225

They should be learned independently, as they learn different patterns (unless you have palindromes). In fact, that is the default in Keras's Bidirectional wrapper:

self.forward_layer = copy.copy(layer)                       # forward direction: the layer as given
config = layer.get_config()
config['go_backwards'] = not config['go_backwards']         # flip the direction flag
self.backward_layer = layer.__class__.from_config(config)   # rebuilt from config -> fresh weights

In the source above, the backward layer is reconstructed from the forward layer's config with only the go_backwards flag flipped, so it gets its own independently initialized weights.
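A quick way to check this yourself (a sketch assuming TensorFlow 2.x, where the wrapper exposes forward_layer and backward_layer attributes):

import numpy as np
import tensorflow as tf

# Build a small bidirectional LSTM and confirm the two directions hold
# separate weight variables.
bi = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(4))
_ = bi(np.zeros((1, 10, 8), dtype="float32"))  # build by calling on a dummy input

fwd_kernel = bi.forward_layer.weights[0]
bwd_kernel = bi.backward_layer.weights[0]

print(fwd_kernel is bwd_kernel)              # False: distinct variables
print(fwd_kernel.shape == bwd_kernel.shape)  # True: same shape, trained independently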

Upvotes: 4

Mr.cysl

Reputation: 1614

They should be learned independently. See expected_hidden_size in PyTorch's RNN implementation here.
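For reference, a minimal sketch (assuming PyTorch) showing that a bidirectional RNN registers a separate parameter set, suffixed _reverse, for the backward direction:

import torch.nn as nn

# bidirectional=True adds a second, independent parameter set per layer,
# suffixed with "_reverse", for the backward pass.
rnn = nn.LSTM(input_size=8, hidden_size=4, bidirectional=True)
print([name for name, _ in rnn.named_parameters()])
# ['weight_ih_l0', 'weight_hh_l0', 'bias_ih_l0', 'bias_hh_l0',
#  'weight_ih_l0_reverse', 'weight_hh_l0_reverse',
#  'bias_ih_l0_reverse', 'bias_hh_l0_reverse']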

Upvotes: 2
