Pranjal Sahu

Reputation: 1469

Feed input to an intermediate layer and then do backpropagation in Keras

I have looked around everywhere but could not find a way to do this. Basically, I want to feed input to some intermediate layer of a Keras model and do backpropagation for the full graph (i.e. including the layers before that intermediate layer). To understand this, I refer you to the figure in the paper "Multi-view Convolutional Neural Networks for 3D Shape Recognition".

[Figure: MVCNN architecture from the paper, with per-view CNNs feeding a view-pooling layer followed by the rest of the network]

From the figure you can see that the features are max-pooled in the view-pooling layer, and the resulting vector is then passed to the rest of the network. In the paper, backpropagation is then carried out through the view-pooled features.

To achieve this I am trying a simple approach. There will not be any view-pooling layer in my model; instead I will do the pooling offline, by computing the features for the multiple views and taking their element-wise max. The aggregated feature will then be passed to the rest of the network. However, I am not able to figure out how to do backpropagation through the full network when the input is passed directly to an intermediate layer.
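A minimal sketch of the offline pooling step I have in mind (the number of views and the feature size are just placeholders) would be:

    import numpy as np

    # Features of one shape from 12 views, each a 4096-d vector computed
    # offline by the first part of the network (both numbers are placeholders).
    view_features = np.random.rand(12, 4096).astype("float32")

    # Element-wise max over the view axis, mimicking the view-pooling layer.
    pooled_feature = view_features.max(axis=0)   # shape: (4096,)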

Any help would be appreciated. Thanks

Upvotes: 2

Views: 948

Answers (1)

bremen_matt

Reputation: 7349

If you have the code of the TensorFlow model, then this will be quite simple. The model probably looks something like

def model( cnns ):

    # pool the per-view CNN features into a single vector (view pooling)
    viewpool_output = f(cnns)
    # pass the pooled vector through the rest of the network
    cnn2_output = cnn2( viewpool_output )
    ...

You would just need to change the model to

def model( viewpool_output ):

    cnn2_output = cnn2( viewpool_output )
    ... 

and instead of passing a "real" view-pooling output, you just pass whatever input you want. But you haven't given any code, so we can only guess at what it looks like.
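For example, in Keras the cut-down model could be written roughly like this (the layer sizes, class count, and training data below are made-up placeholders, and cnn2 is reduced to two dense layers purely for illustration):

    import numpy as np
    from keras.models import Model
    from keras.layers import Input, Dense

    # The model now starts at the (offline-computed) view-pooled feature.
    viewpool_output = Input(shape=(4096,))                # hypothetical feature size
    x = Dense(1024, activation="relu")(viewpool_output)   # stand-in for cnn2
    predictions = Dense(40, activation="softmax")(x)      # hypothetical class count

    model = Model(inputs=viewpool_output, outputs=predictions)
    model.compile(optimizer="adam", loss="categorical_crossentropy")

    # Training backpropagates through every layer downstream of the pooled
    # feature; the layers before the pooling point are simply not part of
    # this graph any more.
    pooled_features = np.random.rand(8, 4096).astype("float32")        # placeholder batch
    labels = np.eye(40, dtype="float32")[np.random.randint(0, 40, 8)]  # placeholder labels
    model.fit(pooled_features, labels, epochs=1)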

Upvotes: 2
