Reputation: 319
When I initialize my Neural Net:
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in (input_data, input_labels):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, Batch {}: '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
I get "ValueError: too many values to unpack (expected 2)" on the for loop.
I figured it might be because I didn't create batches, so I created:
tf.train.batch([input_data, input_labels], batch_size, num_threads=1, capacity=32)
But then I get the error:
TypeError: Cannot convert a list containing a tensor of dtype <dtype: 'uint8'> to <dtype: 'float32'> (Tensor is: <tf.Tensor 'stack_16495:0' shape=(280, 440, 3) dtype=uint8>)
Both input_data and input_labels are lists of tensors created using tf.stack.
Upvotes: 0
Views: 1077
Reputation: 5070
The ValueError is caused by the statement

for batch_features, batch_labels in (input_data, input_labels):

You need

for batch_features, batch_labels in zip(input_data, input_labels):

instead. (input_data, input_labels) is a tuple with two elements, input_data and input_labels, so the loop iterates over those two lists themselves and tries to unpack each one into two variables. zip pairs up corresponding elements from input_data and input_labels, so each iteration yields exactly one features batch and one labels batch.
A small example:
a = [1, 2, 3]
b = ['a', 'b', 'c']
c = (a, b)
d = list(zip(a, b))  # zip returns an iterator in Python 3
c
Out[16]: ([1, 2, 3], ['a', 'b', 'c'])
d
Out[17]: [(1, 'a'), (2, 'b'), (3, 'c')]
Upvotes: 1