PS: Keras version is 2.4.3
The function below builds the VGG16 network without the fully connected layers, because I only want the feature maps.
from keras.models import Model
from keras.layers import Conv2D, MaxPooling2D, Input
import keras.backend as K
import tensorflow as tf
def VGG16(input_tensor=None):
    input_shape = (None, None, 3)
    # Use `is None`: comparing a tensor with `== None` is unreliable
    if input_tensor is None:
        input_tensor = Input(shape=input_shape)
    elif not K.is_keras_tensor(input_tensor):
        input_tensor = Input(tensor=input_tensor, shape=input_shape)
    # Block 1
    vgg = Conv2D(64, (3, 3), padding='same', activation='relu', name='b1c1')(input_tensor)
    vgg = Conv2D(64, (3, 3), padding='same', activation='relu', name='b1c2')(vgg)
    vgg = MaxPooling2D((2, 2), strides=(2, 2), name='b1m')(vgg)
    # Block 2
    vgg = Conv2D(128, (3, 3), padding='same', activation='relu', name='b2c1')(vgg)
    vgg = Conv2D(128, (3, 3), padding='same', activation='relu', name='b2c2')(vgg)
    vgg = MaxPooling2D((2, 2), strides=(2, 2), name='b2m')(vgg)
    # Block 3
    vgg = Conv2D(256, (3, 3), padding='same', activation='relu', name='b3c1')(vgg)
    vgg = Conv2D(256, (3, 3), padding='same', activation='relu', name='b3c2')(vgg)
    vgg = MaxPooling2D((2, 2), strides=(2, 2), name='b3m')(vgg)
    # Block 4
    vgg = Conv2D(512, (3, 3), padding='same', activation='relu', name='b4c1')(vgg)
    vgg = Conv2D(512, (3, 3), padding='same', activation='relu', name='b4c2')(vgg)
    vgg = Conv2D(512, (3, 3), padding='same', activation='relu', name='b4c3')(vgg)
    vgg = MaxPooling2D((2, 2), strides=(2, 2), name='b4m')(vgg)
    # Block 5
    vgg = Conv2D(512, (3, 3), padding='same', activation='relu', name='b5c1')(vgg)
    vgg = Conv2D(512, (3, 3), padding='same', activation='relu', name='b5c2')(vgg)
    vgg = Conv2D(512, (3, 3), padding='same', activation='relu', name='b5c3')(vgg)
    vgg = MaxPooling2D((2, 2), strides=(2, 2), name='b5m')(vgg)
    return vgg
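As a quick sanity check of the backbone's geometry (a plain-Python sketch, no TensorFlow required; the helper name is hypothetical): each of the five MaxPooling2D layers halves the spatial size, and the last block leaves 512 channels, so the feature map is roughly 1/32 of the input in each dimension.

```python
# Sanity check for the VGG16 backbone above: five 2x2, stride-2 max-pools
# each halve the spatial size (floor division for odd sizes), and the final
# conv block outputs 512 channels. Pure Python, no TensorFlow needed.
def feature_map_shape(height, width, num_pools=5, channels=512):
    for _ in range(num_pools):
        # MaxPooling2D((2, 2), strides=(2, 2)) with 'valid' padding
        height, width = height // 2, width // 2
    return (height, width, channels)

print(feature_map_shape(224, 224))  # (7, 7, 512)
print(feature_map_shape(600, 800))  # (18, 25, 512)
```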
This next function receives the VGG16 output as base layers and connects them to a first Conv2D layer, rpn_conv, which in turn feeds the two heads rpn_cls and rpn_reg:
def rpn(base_layers):
    rpn_conv = Conv2D(512, (3, 3), padding='same', kernel_initializer='normal', activation='relu', name='rpn_conv')(base_layers)
    rpn_cls = Conv2D(9, (1, 1), kernel_initializer='uniform', activation='sigmoid', name='rpn_cls')(rpn_conv)
    rpn_reg = Conv2D(4*9, (1, 1), kernel_initializer='zero', activation='linear', name='rpn_reg')(rpn_conv)
    return [rpn_cls, rpn_reg, rpn_conv]
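To make the head sizes above concrete (a plain-Python sketch; the helper names are hypothetical, not part of the question's code): with 9 anchors per feature-map cell, rpn_cls outputs 9 objectness scores and rpn_reg outputs 4 * 9 = 36 box offsets per cell.

```python
# What the RPN head channel counts above imply, assuming 9 anchors per cell.
def rpn_output_channels(num_anchors=9):
    # rpn_cls: one objectness score per anchor; rpn_reg: 4 box offsets per anchor
    return {"cls": num_anchors, "reg": 4 * num_anchors}

def total_anchors(feat_h, feat_w, num_anchors=9):
    # Every cell of the H x W feature map proposes num_anchors boxes
    return feat_h * feat_w * num_anchors

print(rpn_output_channels())   # {'cls': 9, 'reg': 36}
print(total_anchors(18, 25))   # 4050 anchors on an 18x25 feature map
```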
I run the two functions and store the results in bnn (the base layers) and rpn:
bnn = VGG16(Input(shape=(None, None, 3)))
rpn = rpn(bnn)
Then I build the model with the Keras Model class:
model = Model(inputs=Input(shape=(None, None, 3)), outputs=rpn[:2])
And I get this Error:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-43-8c789d1b6abf> in <module>()
2 rpn = rpn(bnn)
3
----> 4 model = Model(inputs=Input(shape=(None, None, 512)), outputs=rpn[:2])
5 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/functional.py in _map_graph_network(inputs, outputs)
929 'The following previous layers '
930 'were accessed without issue: ' +
--> 931 str(layers_with_complete_input))
932 for x in nest.flatten(node.outputs):
933 computable_tensors.add(id(x))
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_19:0", shape=(None, None, None, 3), dtype=float32) at layer "b1c1". The following previous layers were accessed without issue: []
When you build your model with model = Model(inputs=Input(shape=(None, None, 3)), outputs=rpn[:2]), you are creating a brand-new Input tensor. This disconnects the graph; you have to use the tensor you fed into VGG16(). Try:
input_tensor = Input(shape=(None, None, 3))
bnn = VGG16(input_tensor)
rpn = rpn(bnn)
model = Model(inputs=input_tensor, outputs=rpn[:2])
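A rough pure-Python analogy for why the original code fails (a toy node class, not Keras internals): Model(inputs, outputs) walks backwards from the outputs through layer connections, and if that walk never reaches the given inputs, Keras raises "Graph disconnected".

```python
# Toy analogy (not Keras internals) for the "Graph disconnected" error:
# building a Model traces backwards from outputs; a freshly created Input
# tensor was never part of that history, so the trace cannot reach it.
class Node:
    def __init__(self, name, parents=()):
        self.name, self.parents = name, list(parents)

def reaches(output, target):
    # Depth-first walk from output back through parents, loosely mirroring
    # the _map_graph_network traversal shown in the traceback above.
    stack = [output]
    while stack:
        node = stack.pop()
        if node is target:
            return True
        stack.extend(node.parents)
    return False

inp = Node("input")                 # the tensor actually fed into VGG16
feat = Node("b1c1", parents=[inp])  # layers built on top of it
fresh = Node("input_19")            # a brand-new Input(...) tensor

print(reaches(feat, inp))    # True  -> Model(inputs=inp, ...) works
print(reaches(feat, fresh))  # False -> "Graph disconnected"
```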
I think the error is here:
model = Model(inputs=Input(shape=(None, None, 3)), outputs=rpn[:2])
You have to pass the exact same variable as the inputs parameter.