user554481

Reputation: 2075

Extreme memory usage by TensorFlow and Numpy

I'm training a convolutional neural network, and my code behaves very differently depending on whether I comment out this single line:

print("validation accuracy %g" % accuracy.eval(feed_dict={x: validation_predictors, y_: validation_targets, keep_prob: 1.0}))

When this line is commented out, TensorFlow runs smoothly and uses about 8 GB of memory. When it is not commented out, TensorFlow consumes all of the memory on my system and the process is killed with SIGKILL, exiting with return code 137 (out of memory). My training dataset is 32,620 records, and my validation dataset (the one that triggers the error) is only 5,292 records.

I'm using TensorFlow 0.9.0 and numpy 1.11.1 with Python 3.4.4. I have a 2.5 GB dataset. I'm running OS X Yosemite 10.10, and I have 16 GB of RAM on my machine.

Why is this tiny validation dataset blowing up my machine, and what is wrong with my code?

Upvotes: 1

Views: 710

Answers (1)

norman_h

Reputation: 921

Evaluate in smaller batches instead of feeding the whole set in a single eval() call. Try something like this:

# Evaluate in batches of 500 rather than the full set at once
# (range instead of xrange, since the question uses Python 3).
for i in range(10):
    testSet = mnist.test.next_batch(500)
    print("test accuracy %g" % accuracy.eval(feed_dict={x: testSet[0], y_: testSet[1], keep_prob: 1.0}))

Upvotes: 3
