Reputation: 369
I am using TensorFlow to train on the CIFAR-10 dataset. My PC freezes when I run the training loop.
# forward propagation
# convolution layer 1
c1 = tf.nn.conv2d(x_train, w1, strides = [1,1,1,1], padding = 'SAME')
# activation function for c1: relu
r1 = tf.nn.relu(c1)
# maxpooling
p1 = tf.nn.max_pool(r1, ksize = [1,2,2,1], strides = [1,1,1,1], padding = 'SAME')
print('p1 shape: ',p1.shape)
# convolution layer 2
c2 = tf.nn.conv2d(p1, w2, strides = [1,1,1,1], padding='SAME')
# activation function for c2: relu
r2 = tf.nn.relu(c2)
# maxpooling
p2 = tf.nn.max_pool(r2, ksize = [1,2,2,1], strides = [1,2,2,1], padding = 'SAME')
print('p2 shape: ',p2.shape)
# fully connected layer
l1 = tf.contrib.layers.flatten(p2)
# fully connected layer
final = tf.contrib.layers.fully_connected(l1, 10, activation_fn = None)
print('output layer shape: ',final.shape)
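The snippet above assumes the input placeholders and the convolution weights w1 and w2 are already defined earlier in the script. A minimal sketch of such definitions (the shapes, filter sizes, and initializer here are only illustrative assumptions, not taken from my actual code) could look like this:
import tensorflow as tf
# placeholders fed in the training loop (illustrative shapes:
# CIFAR-10 images are 32x32x3, with 10 classes)
x = tf.placeholder(tf.float32, [None, 32, 32, 3])
y = tf.placeholder(tf.float32, [None, 10])
# convolution filters; filter sizes and channel counts are assumptions
w1 = tf.Variable(tf.truncated_normal([5, 5, 3, 32], stddev=0.1))
w2 = tf.Variable(tf.truncated_normal([5, 5, 32, 64], stddev=0.1))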
I am using softmax cross-entropy and the Adam optimizer:
# training and optimization
cross_entropy = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits = final, labels = y_train))
# using adam optimizer
optimize = tf.train.AdamOptimizer(learning_rate).minimize(cross_entropy)
This is where it freezes:
# creating tensorflow session
se = tf.Session()
# initializing variables
se.run(tf.global_variables_initializer())
# training the graph
for i in range(1000):
    x_batch, y_batch = mini_batch(x_train, y_train, 110)
    se.run(optimize, {x: x_batch, y: y_batch})
    cost = se.run(cross_entropy, {x: x_train, y: y_train})
    print(cost)
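The mini_batch helper just returns a random batch from the training arrays. Its actual implementation is not shown above; a rough sketch of what it does (this is an assumption about its behaviour) would be:
import numpy as np
def mini_batch(x_data, y_data, batch_size):
    # pick a random set of row indices and return the corresponding samples
    idx = np.random.choice(x_data.shape[0], batch_size, replace=False)
    return x_data[idx], y_data[idx]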
Upvotes: 0
Views: 840
Reputation: 38
Well, it would have been great if you had also mentioned your PC configuration. Nevertheless, the program you are running is not computationally hard and does not contain an infinite loop, so in my opinion the problem likely comes from your PC: you may be running many other applications, so your Python process cannot get enough resources allocated, which causes the freezing/hanging. It is not necessarily a code-related issue; this code runs fine on my MacBook Pro 2012.
Upvotes: 1