Reputation: 433
I am trying to make my own custom python layer for calculating the network accuracy (being used in Phase: TEST).
My question: should it still have all these 4 methods:
Setup - Initialize the layer and check its inputs
Forward - Compute the layer's output (top) from its input (bottom)
Backward - Given the gradients from the next layer, compute the gradients for the previous layer
Reshape - Reshape your blob if needed
If yes, why? I only want to use it in the TEST phase for calculating the accuracy, not during learning (Forward and Backward seem to be for training).
Thanks everyone!
Upvotes: 2
Views: 444
Reputation: 629
I'm not sure whether Caffe will raise an error if you don't define all four methods, but you will definitely need setup and forward.
Here is an example of an accuracy layer:
# Remark: This class is designed for a binary problem with classes '0' and '1'
# Save this file as accuracyLayer.py
import caffe

class Accuracy_Layer(caffe.Layer):

    def setup(self, bottom, top):
        # We want two bottom blobs: the predictions and the labels
        if len(bottom) != 2:
            raise Exception("Wrong number of bottom blobs (prediction and label)")
        # Initialize some attributes
        self.correctPredictions = 0.0
        self.totalImgs = 0

    def forward(self, bottom, top):
        # The order of the bottoms depends on the prototxt definition
        predictions = bottom[0].data
        labels = bottom[1].data
        self.totalImgs += len(labels)
        for i in range(len(labels)):  # len(labels) equals the batch size
            pred = predictions[i]  # pred holds the normalized probabilities
                                   # of sample i w.r.t. the two classes
            lab = labels[i]
            if pred[0] > pred[1]:  # predicted as class 0
                if lab == 0.0:
                    self.correctPredictions += 1.0
            else:                  # predicted as class 1
                if lab == 1.0:
                    self.correctPredictions += 1.0
        acc = self.correctPredictions / self.totalImgs
        # Output the accuracy to the top blob
        top[0].data[...] = acc

    def reshape(self, bottom, top):
        # The top blob holds a single scalar: the accuracy
        top[0].reshape(1)

    def backward(self, top, propagate_down, bottom):
        # This layer does not back-propagate
        pass
And here is how you would define it in your prototxt. The include directive tells Caffe that this layer is only present during the TEST phase (note the layer above writes a single top blob, the accuracy):
layer {
  name: "metrics"
  type: "Python"
  top: "Acc"
  bottom: "prediction"  # let's suppose we have these two bottom blobs
  bottom: "label"
  python_param {
    module: "accuracyLayer"
    layer: "Accuracy_Layer"
  }
  include {
    phase: TEST  # this ensures it will only be executed in the TEST phase
  }
}
BTW, I've written a gist with a slightly more complex example of an accuracy Python layer that might be what you're looking for.
Upvotes: 1