Nima

Reputation: 433

Caffe custom python layer for "accuracy"

I am trying to make my own custom Python layer for calculating the network accuracy (to be used in the TEST phase).

My question: should it still have all these four functions (setup, reshape, forward and backward)?

If yes, why? I only want to use it in the TEST phase for calculating the accuracy, not during learning (forward and backward seem to be for training).

Thanks everyone!

Upvotes: 2

Views: 444

Answers (1)

rafaspadilha

Reputation: 629

I'm not sure whether Caffe will raise an error if you don't define all four of those methods, but you will definitely need Setup and Forward:

  • Setup: exactly what you said. For example, in my accuracy layers, I usually save some metrics (true and false positives/negatives, F-scores) for my whole testing set, along with the softmax probabilities of each sample, in case I want to combine/fuse different networks/methods later. This is where I open the file I'll write that information to;
  • Forward: this is where you calculate your accuracy per se, comparing the prediction with the label for each sample in your batch. This layer will usually have two inputs: the label (ground truth, probably fed by the data/input layer) and a layer that outputs the prediction/scores/probabilities for each sample in your batch, per class (I usually use a Softmax layer);
  • Reshape and Backward: don't worry much about these. This layer does not back propagate, and you won't need to reshape your input blobs.

Here is an example of an accuracy layer:

# Remark: This class is designed for a binary problem with classes '0' and '1'
# Saving this file as accuracyLayer.py

import caffe
TRAIN = 0
TEST = 1

class Accuracy_Layer(caffe.Layer):
    #Setup method
    def setup(self, bottom, top):
        #We want two bottom blobs, the labels and the predictions
        if len(bottom) != 2:
            raise Exception("Wrong number of bottom blobs (prediction and label)") 

        #Initialize some attributes
        self.correctPredictions = 0.0
        self.totalImgs = 0

    #Forward method
    def forward(self, bottom, top):
        #The order of these depends on the prototxt definition
        predictions = bottom[0].data
        labels = bottom[1].data

        self.totalImgs += len(labels)

        for i in range(len(labels)):  #len(labels) is equal to the batch size
            pred = predictions[i]     #pred holds the normalized probabilities
                                      #of sample i w.r.t. the two classes
            lab = labels[i]

            if pred[0] > pred[1]:     #this means it was predicted as class 0
                if lab == 0.0:
                    self.correctPredictions += 1.0
            else:                     #else, predicted as class 1
                if lab == 1.0:
                    self.correctPredictions += 1.0

        acc = self.correctPredictions / self.totalImgs

        #output data to top blob
        top[0].data[...] = acc

    def reshape(self, bottom, top):
        """
        The input blobs don't need reshaping, but the top blob must be
        given its shape: a single scalar holding the running accuracy
        """
        top[0].reshape(1)

    def backward(self, top, propagate_down, bottom):
        """
        This layer does not back propagate
        """
        pass
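Independent of Caffe, the bookkeeping that forward performs is easy to sanity-check on its own. A minimal NumPy sketch (the probabilities and labels below are made-up toy values) reproduces the same comparison in vectorized form:

```python
import numpy as np

# Toy batch: 4 samples, 2-class softmax probabilities (each row sums to 1)
predictions = np.array([[0.9, 0.1],   # predicted as class 0
                        [0.2, 0.8],   # predicted as class 1
                        [0.6, 0.4],   # predicted as class 0
                        [0.3, 0.7]])  # predicted as class 1
labels = np.array([0.0, 1.0, 1.0, 1.0])

# Running counters, as kept by the layer across batches
correctPredictions = 0.0
totalImgs = 0

totalImgs += len(labels)
# argmax over each row is equivalent to the pred[0] > pred[1] comparison
predicted_classes = predictions.argmax(axis=1)
correctPredictions += np.sum(predicted_classes == labels)

acc = correctPredictions / totalImgs
print(acc)  # 3 of the 4 samples match their label, so 0.75
```

Because the counters accumulate across calls, the value written to the top blob is the accuracy over all batches seen so far, not just the current one.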

As well as how you would define it in your prototxt. Here is where you will say to Caffe that this layer will only be present during the TEST phase:

layer {
  name: "metrics"
  type: "Python"
  top: "Acc"             #the example layer above writes a single top blob

  bottom: "prediction"   #let's suppose we have these two bottom blobs
  bottom: "label"

  python_param {
    module: "accuracyLayer"
    layer: "Accuracy_Layer"
  }
  include {
    phase: TEST    #this ensures it will only be executed in the TEST phase
  }
}
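One practical note: the `module: "accuracyLayer"` parameter is resolved through Python's normal import machinery, so the directory containing accuracyLayer.py must be on the import path when Caffe runs. A common way to do that (the path below is a placeholder for wherever you saved the file):

```shell
# Hypothetical location; point this at the directory holding accuracyLayer.py
export PYTHONPATH=/path/to/your/layers:$PYTHONPATH
```

If the module can't be imported, Caffe will fail when parsing the prototxt, so this is worth checking before anything else.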

BTW, I've written a gist with a slightly more complex example of an accuracy Python layer that might be what you're looking for.

Upvotes: 1
