Reputation: 181
I am trying to write a discriminator that evaluates patches of an image. To do so, I generate 32x32 non-overlapping patches from the input and concatenate them along a new axis.
The reason I am using TimeDistributed layers is that, in the end, the discriminator should evaluate the whole image as real or fake. So I perform a forward pass on each patch individually and then average the discriminator output across the patches with a Lambda layer:
def my_average(x):
    x = K.mean(x, axis=1)
    return x

def my_average_shape(input_shape):
    shape = list(input_shape)
    del shape[1]
    return tuple(shape)
def defineD(input_shape):
    a = Input(shape=(256, 256, 1))
    cropping_list = []
    n_patches = 256 / 32
    for x in range(256 / 32):
        for y in range(256 / 32):
            cropping_list += [
                K.expand_dims(
                    Cropping2D(((x * 32, 256 - (x + 1) * 32), (y * 32, 256 - (y + 1) * 32)))(a),
                    axis=1)
            ]
    x = Concatenate(1)(cropping_list)
    x = TimeDistributed(Conv2D(4 * 8, 3, padding='same'))(x)
    x = TimeDistributed(MaxPooling2D())(x)
    x = TimeDistributed(LeakyReLU())(x)  # 16
    x = TimeDistributed(Conv2D(4 * 16, 3, padding='same'))(x)
    x = TimeDistributed(MaxPooling2D())(x)
    x = TimeDistributed(LeakyReLU())(x)  # 8
    x = TimeDistributed(Conv2D(4 * 32, 3, padding='same'))(x)
    x = TimeDistributed(MaxPooling2D())(x)
    x = TimeDistributed(LeakyReLU())(x)  # 4
    x = TimeDistributed(Flatten())(x)
    x = TimeDistributed(Dense(2, activation='sigmoid'))(x)
    x = Lambda(my_average, my_average_shape)(x)
    return keras.models.Model(inputs=a, outputs=x)
For some reason I get the following error:
File "testing.py", line 41, in <module>
defineD((256,256,1) )
File "testing.py", line 38, in defineD
return keras.models.Model(inputs=a, outputs=x)
File "/usr/local/lib/python2.7/dist-packages/keras/legacy/interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 93, in __init__
self._init_graph_network(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 237, in _init_graph_network
self.inputs, self.outputs)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1353, in _map_graph_network
tensor_index=tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1340, in build_map
node_index, tensor_index)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/network.py", line 1312, in build_map
node = layer._inbound_nodes[node_index]
AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
Upvotes: 1
Views: 3611
Reputation: 549
I ran into the same issue, and it was indeed solved by wrapping the tensor in a Lambda layer as @today proposed. Thanks for that hint, it pointed me in the right direction. I wanted to concatenate a vector with a square image by turning the vector into a diagonal matrix. It worked with the following snippet:
def diagonalize(vector):
    diagonalized = tf.matrix_diag(vector)  # make a diagonal matrix from the vector
    out_singlechan = tf.expand_dims(diagonalized, -1)  # append a channel axis to match the multi-channel image dims
    return out_singlechan

lstm_out = Lambda(diagonalize, output_shape=(self.img_shape[0], self.img_shape[1], 1))(lstm_out)
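For reference, the shape transformation this performs can be reproduced with plain NumPy (the function name diagonalize_np is just for illustration): each (batch, n) vector becomes a (batch, n, n, 1) single-channel "diagonal image", which is what makes it concatenable with a square image.

```python
import numpy as np

def diagonalize_np(batch):
    """Mimic tf.matrix_diag + tf.expand_dims: turn each (n,) vector in a
    (batch, n) array into an (n, n) diagonal matrix with a channel axis."""
    n = batch.shape[-1]
    diag = batch[..., None] * np.eye(n)  # broadcast onto the identity -> (batch, n, n)
    return diag[..., None]               # append channel axis -> (batch, n, n, 1)

out = diagonalize_np(np.array([[1.0, 2.0, 3.0]]))
print(out.shape)        # (1, 3, 3, 1)
print(out[0, :, :, 0])  # diag(1, 2, 3)
```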
Upvotes: 0
Reputation: 33410
The error occurs because K.expand_dims is a backend operation, not a Keras layer, so the tensors it returns cannot be fed directly into other layers when building a Model. You need to put your cropping operations in a function and then apply that function in a Lambda layer:
def my_cropping(a):
    cropping_list = []
    for x in range(256 // 32):
        for y in range(256 // 32):
            cropping_list += [
                K.expand_dims(
                    Cropping2D(((x * 32, 256 - (x + 1) * 32), (y * 32, 256 - (y + 1) * 32)))(a),
                    axis=1)
            ]
    return cropping_list
To use it:
cropping_list = Lambda(my_cropping)(a)
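As a side note, those 64 cropping operations amount to a single reshape/transpose. Independent of the Keras wiring, the patch split can be sanity-checked in plain NumPy (the helper name extract_patches is illustrative, not a Keras API):

```python
import numpy as np

def extract_patches(images, patch=32):
    """Split (batch, H, W, C) images into non-overlapping patch x patch
    tiles stacked on a new axis -> (batch, n_patches, patch, patch, C)."""
    b, h, w, c = images.shape
    x = images.reshape(b, h // patch, patch, w // patch, patch, c)
    x = x.transpose(0, 1, 3, 2, 4, 5)  # bring the two grid axes together
    return x.reshape(b, (h // patch) * (w // patch), patch, patch, c)

imgs = np.arange(2 * 256 * 256, dtype=np.float32).reshape(2, 256, 256, 1)
patches = extract_patches(imgs)
print(patches.shape)  # (2, 64, 32, 32, 1)
```

Patch index x * 8 + y here matches the row-major order of the cropping loop above, so patches[:, 1] is the tile at rows 0:32, columns 32:64.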
Upvotes: 3