Reputation: 413
What I'm trying to do is get the value out of a 1 x 1 tensor, and I have nearly 6000 of them.
I've tried using eval() and session() so far. The best I could come up with was converting each tensor to numpy to get the value out of it. The problem is that this is extremely slow, especially when dealing with a large amount of data. Is there any fast way to retrieve the data from a tensor?
For additional information, this is the part of my code where I'm trying to do this.
import tensorflow as tf
from tensorflow import keras

cross_IF = []
count = 0
for i in range(len(test_IF)):
    if count % 100 == 0:
        print(count)  # progress indicator every 100 samples
    count += 1
    c = keras.losses.categorical_crossentropy(test_IF[i], prediction_IF[i])
    element = keras.backend.eval(tf.reduce_sum(c))  # slow: one eval per sample
    cross_IF.append(element)
cross_IF is the list I use to collect the values of the tensor tf.reduce_sum(c). test_IF and prediction_IF are the test values and prediction values.
Upvotes: 3
Views: 455
Providing the resolution in the Answer section for the benefit of the community.
The issue was that categorical_crossentropy returns a tensor, not a numpy array. Converting each cross-entropy tensor to numpy and appending it to the list one at a time took more time. Instead, keeping all the cross entropies in tensor form, concatenating them, and converting the result to numpy once at the end made it faster.
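For reference, a minimal sketch of that idea, assuming TF 2.x eager mode and that test_IF and prediction_IF are (N, num_classes) arrays of class probabilities (the random data below is purely illustrative, not the poster's actual data):

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Hypothetical stand-ins for test_IF and prediction_IF: N samples,
# each a probability distribution over num_classes classes.
N, num_classes = 6000, 10
test_IF = np.random.dirichlet(np.ones(num_classes), size=N).astype("float32")
prediction_IF = np.random.dirichlet(np.ones(num_classes), size=N).astype("float32")

# Compute the cross entropy for the whole batch in one call; the result is a
# single tensor of shape (N,) instead of 6000 separate 1 x 1 tensors.
cross_tensor = keras.losses.categorical_crossentropy(
    tf.constant(test_IF), tf.constant(prediction_IF)
)

# Convert to numpy exactly once, at the end. (In TF 1.x graph mode this would
# be a single keras.backend.eval(cross_tensor) call instead.)
cross_IF = cross_tensor.numpy()
print(cross_IF.shape)  # (6000,)

The point is the same as in the answer: do the tensor-to-numpy conversion once for the concatenated result rather than once per sample.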
Upvotes: 1