Fangzhou Zhai

Reputation: 433

how to use tf operations in keras models

I am trying to use tensorflow operations within a keras model, and I am quite confused about the mechanism and about what Lambda layers do to tf tensors.

So this works:

import tensorflow as tf
import keras

a = keras.layers.Input(shape=[1, 2], dtype='float', name='a')
s = keras.layers.Lambda(lambda x: tf.transpose(tf.transpose(x)))(a)
model = keras.models.Model(inputs=a, outputs=s)

but this does not work:

a = keras.layers.Input(shape=[1, 2], dtype='float', name='a')
s = tf.transpose(tf.transpose(a))
s = keras.layers.Lambda(lambda x: x)(s)
model = keras.models.Model(inputs=a, outputs=s)

and it says:

AttributeError: 'Tensor' object has no attribute '_keras_history'

so is it always necessary to pack up tf operations within a layer?

Question 2 (which is what led me to the previous one): do we have to wrap things in a custom layer to do matrix multiplication in keras?

thanks.

Upvotes: 4

Views: 2038

Answers (1)

kww

Reputation: 549

Question 1: Yes, it is necessary to wrap tf operations with a layer, because keras models require certain functions/variables that aren't included with tensorflow ops. In this case, _keras_history is a property that is only produced by wrapping the op with a layer.
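For instance, a minimal sketch (the input shapes and the tf.matmul op here are just illustrative) of wrapping a TensorFlow op in a Lambda layer so the output carries the metadata Keras needs:

import tensorflow as tf
import keras

a = keras.layers.Input(shape=[2, 3], name='a')
b = keras.layers.Input(shape=[3, 4], name='b')

# wrapping tf.matmul in a Lambda layer attaches _keras_history to the
# output tensor, so Model() can trace the graph back to the inputs
s = keras.layers.Lambda(lambda x: tf.matmul(x[0], x[1]))([a, b])
model = keras.models.Model(inputs=[a, b], outputs=s)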

Question 2: Is the matrix multiplication trainable? Have you considered using a keras Dense layer with use_bias=False? If you want a constant weight matrix, you could set the kernel_initializer to a constant initializer and trainable=False.
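A minimal sketch of that idea (the weight matrix W and the shapes are made up; depending on your Keras version, loading an array-valued constant kernel with set_weights can be easier than using a Constant initializer):

import numpy as np
import keras

# hypothetical constant weight matrix for the fixed multiplication
W = np.array([[1.0, 2.0],
              [3.0, 4.0]], dtype='float32')

a = keras.layers.Input(shape=[1, 2], dtype='float', name='a')

# Dense with use_bias=False computes x @ kernel; trainable=False freezes it
dense = keras.layers.Dense(W.shape[1], use_bias=False, trainable=False)
s = dense(a)
model = keras.models.Model(inputs=a, outputs=s)

# load the constant kernel after the layer has been built
dense.set_weights([W])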

Upvotes: 4
