0811张庆昊

Reputation: 548

TensorFlow: Why do I get different results even though I run this code twice in the same place without updating any parameters?

I have built a CNN model in mnist_inference.py, and I want to compute the accuracy every 100 steps. But I found it doesn't work right. After a long time debugging, I found that the result changed every time I computed the value of y. At first I thought it was because the parameters were auto-updated when I computed y. But no! I found the parameters didn't change. So how do I compute the accuracy of my model? This is my code: mycode

Upvotes: 0

Views: 143

Answers (1)

dm0_

Reputation: 2156

This line of your code:

y = mnist_inference.inference(x, True, regularizer)

creates the model with dropout enabled, because True is passed as the train argument:

def inference(input_tensor, train, regularizer):

    # ... earlier layers omitted; `reshaped`, `nodes`, and FC_SIZE come from the full file

    with tf.variable_scope('layer5-fc1'):
        fc1_weights = tf.get_variable("weight", [nodes, FC_SIZE],
                                      initializer = tf.truncated_normal_initializer(stddev = 0.1))

        if regularizer is not None:
            tf.add_to_collection('losses', regularizer(fc1_weights))
        fc1_biases = tf.get_variable('bias', [FC_SIZE], initializer = tf.constant_initializer(0.1))
        fc1 = tf.nn.relu(tf.matmul(reshaped, fc1_weights)+fc1_biases)

        # enables dropout!
        if train:
            fc1 = tf.nn.dropout(fc1, 0.5)

So you have dropout enabled, and that is what causes the randomness you observe.
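You can see the effect by evaluating the same dropout op twice in one session; a minimal sketch with made-up values (the tensors here are illustrative, not from your code):

import tensorflow as tf

x = tf.ones([1, 4])
# keep_prob=0.5: each unit is kept with probability 0.5 and scaled by 1/0.5
dropped = tf.nn.dropout(x, keep_prob=0.5)

with tf.Session() as sess:
    print(sess.run(dropped))  # e.g. [[2. 0. 2. 0.]]
    print(sess.run(dropped))  # e.g. [[0. 2. 2. 2.]] -- a different random mask each run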

You need to disable dropout when computing accuracy. The higher-level tf.layers.dropout has a corresponding training parameter (which can be a tensor).
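One common pattern, sketched below for a TF 1.x graph (the names is_training, inference, train_op, accuracy, and the layer sizes are illustrative, not taken from your mnist_inference.py), is to feed the training flag as a boolean tensor so the same graph can be run with dropout on or off:

import tensorflow as tf

# Boolean placeholder that switches dropout on (training) or off (evaluation).
is_training = tf.placeholder(tf.bool, name='is_training')

def inference(input_tensor, is_training):
    fc1 = tf.layers.dense(input_tensor, 512, activation=tf.nn.relu)
    # `training` accepts a tensor, so the dropout mask is applied only
    # when the feed_dict sets is_training to True.
    fc1 = tf.layers.dropout(fc1, rate=0.5, training=is_training)
    return tf.layers.dense(fc1, 10)

x = tf.placeholder(tf.float32, [None, 784])
y = inference(x, is_training)

# Training step:   sess.run(train_op, feed_dict={x: batch_xs, is_training: True})
# Accuracy check:  sess.run(accuracy, feed_dict={x: test_xs, is_training: False})

With this setup the accuracy you compute every 100 steps is deterministic for a fixed set of weights, because no units are dropped during evaluation.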

Upvotes: 2
