PandaBearSoup

Reputation: 699

Caffe compute gradient with respect to input using custom cost function

I have a pretrained caffe model with no loss layers. I want to do the following steps:

  1. Compute the cost/grad of some layer in the net.
  2. Backpropagate to compute the gradient with respect to the input layer.
  3. Perform gradient descent repeating 1 and 2 to optimize input.

I cannot figure out how to add a loss layer to a pretrained model to do this. In other neural-network frameworks you can call a backward() function and pass it a cost function. Is there any way to do this in Caffe?
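The exact pycaffe calls depend on the net, but the three steps above amount to ordinary gradient descent on the input while the weights stay fixed. A minimal NumPy sketch with a toy linear "net" (the matrix `W`, cost, and names here are hypothetical, standing in for the pretrained model and custom cost):

```python
import numpy as np

# Toy stand-in for a pretrained net: y = W x, with fixed "pretrained" weights.
# We optimize the *input* x to minimize a custom cost on the output,
# leaving W untouched.
W = np.array([[2.0, 0.0],
              [0.0, 3.0]])
target = np.array([4.0, 9.0])   # hypothetical desired output

def forward(x):
    return W @ x

def cost_and_grad(y):
    # Custom cost: 0.5 * ||y - target||^2; its gradient w.r.t. y is (y - target).
    diff = y - target
    return 0.5 * diff @ diff, diff

x = np.zeros(2)   # initial input guess
lr = 0.05
for _ in range(200):
    y = forward(x)                  # step 1: forward pass
    cost, dy = cost_and_grad(y)     # step 1: cost and gradient at the output
    dx = W.T @ dy                   # step 2: backprop the gradient to the input
    x -= lr * dx                    # step 3: gradient descent on the input
```

In pycaffe the same pattern would be: run `net.forward()`, write your cost's gradient into the chosen layer's `diff` blob, call `net.backward()`, and update the input blob's data from its `diff`.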

Upvotes: 1

Views: 399

Answers (1)

Anoop K. Prabhu

Reputation: 5645

You can create a custom layer in Caffe for your cost function and reference that layer in the .prototxt file. You can then finetune the pretrained model with your new cost function.
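For instance, if the custom cost is implemented as a Caffe Python layer, the .prototxt might declare it like this (the layer, module, and blob names below are hypothetical, sketched for illustration):

```
layer {
  name: "my_custom_loss"
  type: "Python"
  bottom: "fc8"      # hypothetical output blob of the pretrained net
  bottom: "label"
  top: "loss"
  python_param {
    module: "my_loss_layer"   # hypothetical my_loss_layer.py on PYTHONPATH
    layer: "MyLossLayer"
  }
  loss_weight: 1
}
```

The `loss_weight: 1` field is what makes Caffe treat the layer's output as a loss during training.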

Finetuning is done with the following command-line invocation:

./build/tools/caffe train --solver theAboveMentioned.prototxt --weights thePreTrainedWeightsFile

More on caffe finetuning can be found here.

Upvotes: -1
