Reputation: 1465
I am working on a class that creates all sorts of symmetric autoencoders (AEs). I am now porting this class to TF 2.0, and it is more complicated than I thought. I use subclassing of layers and models to achieve this, so I want to group several Keras layers into one Keras layer. But if I write something like this:
def __init__(self, name, keras_layer, **kwargs):
    self.keras_layer = tf.keras.layers.Conv2D
    super(CoderLayer, self).__init__(name=name, **kwargs)
I get the following error, because TF tries to use this uninstantiated layer:
TypeError: _method_wrapper() missing 1 required positional argument: 'self'
I also tried wrapping it in a list, but that did not work either.
EDIT
Here is a minimal reproducible example and the full traceback:
import tensorflow as tf
print(tf.__version__)  # 2.0.0-alpha0

class CoderLayer(tf.keras.layers.Layer):

    def __init__(self, name, keras_layer):
        self.keras_layer = keras_layer
        self.out = keras_layer(12, [3, 3])
        super(CoderLayer, self).__init__(name=name)

    def call(self, inputs):
        return self.out(inputs)

inputs = tf.keras.Input(shape=(200, 200, 3), batch_size=12)
layer = CoderLayer("minimal_example", tf.keras.layers.Conv2D)
layer(inputs)
Traceback:
Traceback (most recent call last):
  File "..\baseline_cae.py", line 24, in <module>
    layer(inputs)
  File "..\AppData\Local\Continuum\anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 581, in __call__
    self._clear_losses()
  File "..\AppData\Local\Continuum\anaconda3\lib\site-packages\tensorflow\python\training\tracking\base.py", line 456, in _method_wrapper
    result = method(self, *args, **kwargs)
  File "..\AppData\Local\Continuum\anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 818, in _clear_losses
    layer._clear_losses()
TypeError: _method_wrapper() missing 1 required positional argument: 'self'
Upvotes: 1
Views: 484
Reputation: 8595
The problem is with setting a non-instantiated class as an attribute on a subclass of tf.keras.layers.Layer. If you remove the following line

self.keras_layer = keras_layer

the code works:
import tensorflow as tf

class CoderLayer(tf.keras.layers.Layer):

    def __init__(self, name, keras_layer):
        super(CoderLayer, self).__init__(name=name)
        self.out = keras_layer(12, [3, 3])

    def call(self, inputs):
        return self.out(inputs)

inputs = tf.keras.Input(shape=(200, 200, 3), batch_size=12)
layer = CoderLayer("minimal_example", tf.keras.layers.Conv2D)
print(layer(inputs))
# Tensor("minimal_example_3/conv2d_12/BiasAdd:0", shape=(12, 198, 198, 12), dtype=float32)
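Note also that super(CoderLayer, self).__init__(name=name) is now called before any attributes are set; Keras layers generally expect super().__init__() to run first so that attribute tracking is in place before sub-layers are assigned.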
It is probably a bug. A similar issue has already been raised (if you put your non-instantiated class into a list and assign it as an attribute, you will get the same exception).
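For completeness, here is a minimal sketch of that list variant (the class name ListCoderLayer and the attribute name are just illustrative); on the same TF 2.0 alpha build it reportedly fails with the same TypeError once the layer is called:

import tensorflow as tf

class ListCoderLayer(tf.keras.layers.Layer):

    def __init__(self, name, keras_layer):
        super(ListCoderLayer, self).__init__(name=name)
        # the list still contains the class itself, not an instance,
        # so it goes through the same attribute-tracking path
        self.keras_layers = [keras_layer]
        self.out = keras_layer(12, [3, 3])

    def call(self, inputs):
        return self.out(inputs)

inputs = tf.keras.Input(shape=(200, 200, 3), batch_size=12)
ListCoderLayer("list_example", tf.keras.layers.Conv2D)(inputs)  # raises the same TypeError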
This could be a possible workaround if you want to use multiple layers:
class CoderLayer(tf.keras.layers.Layer):

    def __init__(self, name, layername):
        super(CoderLayer, self).__init__(name=name)
        self.layer = layername
        # look up the layer class by its name instead of storing the class itself
        self.out = tf.keras.layers.__dict__[layername](1, 2)

    def call(self, inputs):
        return self.out(inputs)

inputs = tf.random.normal([1, 3, 3, 1])
layer = CoderLayer("mylayer", 'Conv2D')
layer(inputs).numpy()
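A small variation on the same idea, in case you prefer getattr over the module __dict__ lookup; the class name CoderLayerByName and the default filters/kernel_size arguments below are just placeholders:

import tensorflow as tf

class CoderLayerByName(tf.keras.layers.Layer):

    def __init__(self, name, layername, filters=12, kernel_size=(3, 3)):
        super(CoderLayerByName, self).__init__(name=name)
        # only the layer *name* (a plain string) is stored as an attribute;
        # the class is resolved and instantiated right away
        self.layername = layername
        self.out = getattr(tf.keras.layers, layername)(filters, kernel_size)

    def call(self, inputs):
        return self.out(inputs)

inputs = tf.random.normal([1, 16, 16, 3])
layer = CoderLayerByName("getattr_example", "Conv2D")
print(layer(inputs).shape)  # (1, 14, 14, 12)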
Upvotes: 1