jwsmithers

Reputation: 276

How to plot a ROC curve with Tensorflow and scikit-learn?

I'm trying to plot the ROC curve from a modified version of the CIFAR-10 example provided by tensorflow. It's now for 2 classes instead of 10.

The output of the network are called logits and take the form:

[[-2.57313061  2.57966399]
 [ 0.04221377 -0.04033273]
 [-1.42880082  1.43337202]
 [-2.7692945   2.78173304]
 [-2.48195744  2.49331546]
 [ 2.0941515  -2.10268974]
 [-3.51670194  3.53267646]
 [-2.74760485  2.75617766]
 ...]

First of all, what do these logits actually represent? The final layer in the network is a "softmax linear" of form WX+b.

The model is able to calculate accuracy by calling

top_k_op = tf.nn.in_top_k(logits, labels, 1)

Then once the graph has been initialized:

predictions = sess.run([top_k_op])  # boolean array: True where the true label is the top prediction
predictions_int = np.array(predictions).astype(int)
true_count += np.sum(predictions)
...
precision = true_count / total_sample_count

This works fine.

But now how can I plot a ROC curve from this?

I've been trying the "sklearn.metrics.roc_curve()" function (http://scikit-learn.org/stable/modules/generated/sklearn.metrics.roc_curve.html#sklearn.metrics.roc_curve) but I don't know what to use as my "y_score" parameter.

Any help would be appreciated!

Upvotes: 7

Views: 20788

Answers (2)

rickymf4

Reputation: 37

import tensorflow as tf

tp = []  # true positive rate at each threshold (fill these lists in first)
fp = []  # false positive rate at each threshold
total = len(fp)

writer = tf.train.SummaryWriter("/tmp/tensorboard_roc")
for idx in range(total):
    summt = tf.Summary()
    summt.value.add(tag="roc", simple_value=tp[idx])
    # use the false positive rate (scaled to an integer) as the step,
    # so TensorBoard plots TPR against FPR
    writer.add_summary(summt, int(fp[idx] * 100))
writer.flush()
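The tp and fp lists above are left empty here; a minimal sketch of one way to fill them, assuming you already have arrays y_true (0/1 labels) and y_score (positive-class scores) for your evaluation set, is to take them from sklearn.metrics.roc_curve:

from sklearn.metrics import roc_curve

# y_true: 0/1 labels, y_score: positive-class score per sample (both assumed available)
fpr, tpr, thresholds = roc_curve(y_true, y_score)
fp = list(fpr)  # false positive rate at each threshold
tp = list(tpr)  # true positive rate at each threshold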

Then start TensorBoard:

tensorboard --logdir=/tmp/tensorboard_roc

(screenshot: the resulting ROC curve displayed in TensorBoard)

For details and code, you can visit my blog: http://blog.csdn.net/mao_feng/article/details/54731098

Upvotes: 1

fin

Reputation: 156

'y_score' here should be an array with one entry per sample, giving the probability (or score) that that sample belongs to the positive class (the class labeled 1 in your y_true array).

Actually, if your network used softmax as the last layer, the model would output the probability of each category for each instance, but the data you've given here doesn't conform to that format. I checked the example code (https://github.com/tensorflow/tensorflow/blob/r0.10/tensorflow/models/image/cifar10/cifar10.py) and it seems to use a layer called softmax_linear, which is just WX+b without the softmax applied. I don't know this example well, but I guess you should process the output with something like the softmax (for two classes, the logistic function) to turn it into probabilities.
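For example, a minimal NumPy sketch of that conversion, assuming logits holds the 2-column array shown in the question:

import numpy as np

# logits: shape (num_samples, 2), raw outputs of the softmax_linear layer
exp_logits = np.exp(logits - np.max(logits, axis=1, keepdims=True))  # subtract the row max for numerical stability
probs = exp_logits / np.sum(exp_logits, axis=1, keepdims=True)       # each row now sums to 1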

Then just feed it, along with your true labels 'y_true', to the scikit-learn function:

from sklearn.metrics import roc_curve
y_score = np.array(output)[:, 1]  # output: per-class probabilities (e.g. probs above); column 1 = positive class
fpr, tpr, thresholds = roc_curve(y_true, y_score)
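To actually draw the curve, a minimal matplotlib sketch using the fpr and tpr returned above:

import matplotlib.pyplot as plt
from sklearn.metrics import auc

roc_auc = auc(fpr, tpr)  # area under the ROC curve
plt.plot(fpr, tpr, label='ROC curve (AUC = %0.2f)' % roc_auc)
plt.plot([0, 1], [0, 1], linestyle='--')  # chance line
plt.xlabel('False positive rate')
plt.ylabel('True positive rate')
plt.legend(loc='lower right')
plt.show()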

Upvotes: 1
