Reputation: 126
I have two Keras models. The first takes a string as input and predicts, for example, five classes.
In the second model I want to use this output, but the outputs of the first model should be summed into a single output across multiple inputs.
I want a single prediction for the sum of all entered strings, not a prediction for each entered string.
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense

model1 = tf.keras.Sequential()
model1.add(Input(shape=(len(inputs[0]),), dtype=tf.float32))
model1.add(Dense(256, activation='relu'))
model1.add(Dense(len(helper_classes), activation='softmax'))

model2 = tf.keras.Sequential()
model2.add(model1)
model2.add(Dense(16))
model2.add(Dense(len(classes), activation=tf.nn.softmax))
model2.layers[0].trainable = False
model2.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model2.summary()
For explanation: the strings are preprocessed into float vectors.
Actual output of model1:
Input: "Hello","World", ...
Output: [0.2, 0, 0, 0.8, 0],[0, 0, 0.4, 0, 0.6], ...
What I need:
Input: "Hello","World", ...
Output: [0.2 + 0.0 + ... , 0 + 0.0 + ... , 0 + 0.4 + ... , 0.8 + 0.0 + ... , 0 + 0.6 + ...]
[Image of model1]
[Image of model1 after adding the reduction layer]
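The desired combination can be sketched without any model at all: the per-string prediction vectors are summed over the batch axis (axis 0). A minimal NumPy sketch using the example values above:

```python
import numpy as np

# model1's per-string predictions for "Hello" and "World" (example values)
preds = np.array([[0.2, 0.0, 0.0, 0.8, 0.0],
                  [0.0, 0.0, 0.4, 0.0, 0.6]])

# Summing over the batch axis (axis 0) collapses all strings into one vector
combined = preds.sum(axis=0)
print(combined)  # [0.2 0.  0.4 0.8 0.6]
```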
Solution
Okay, I solved it now. My first mistake was that I summed over axis 1, which I fixed with the help of Vlad.
The second mistake was that I did not keep the dimensions with keepdims=True.
The solution was to insert a Lambda layer into the second model that does what Vlad and Thibault proposed:
from tensorflow.keras.layers import Lambda
from tensorflow.keras import backend as K

model2 = tf.keras.Sequential()
model2.add(model1)
model2.add(Lambda(lambda x: K.sum(x, axis=0, keepdims=True)))
model2.add(Dense(16))
model2.add(Dense(len(classes), activation=tf.nn.softmax))
model2.layers[0].trainable = False
model2.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
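The effect of the Lambda layer can be checked in isolation. A minimal sketch (assuming TF 2.x eager mode) that mirrors model2's reduction-plus-head tail, showing how a whole batch collapses to a single row:

```python
import tensorflow as tf
from tensorflow.keras.layers import Lambda, Dense
from tensorflow.keras import backend as K

# A stand-in for model2's tail: the reducing Lambda followed by a Dense head
reducer = tf.keras.Sequential([
    Lambda(lambda x: K.sum(x, axis=0, keepdims=True)),
    Dense(16),
])

# A batch of 4 five-class prediction vectors collapses to a single row
out = reducer(tf.ones((4, 5)))
print(out.shape)  # (1, 16)
```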
Upvotes: 0
Views: 3009
Reputation: 8585
Use tf.reduce_sum():
import tensorflow as tf

output = tf.Variable([[0.2, 0.0, 0.0, 0.8, 0.0],
                      [0.0, 0.0, 0.4, 0.0, 0.6]])
reduced = tf.reduce_sum(output, axis=0)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(reduced.eval())
    # [0.2 0.  0.4 0.8 0.6]
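The snippet above uses the TF 1.x session API. In TF 2.x eager mode the same reduction runs directly, with no Session or variable initialization (a minimal sketch):

```python
import tensorflow as tf

# Same example values; tf.constant evaluates eagerly in TF 2.x
output = tf.constant([[0.2, 0.0, 0.0, 0.8, 0.0],
                      [0.0, 0.0, 0.4, 0.0, 0.6]])

# Sum over the batch axis (axis 0)
reduced = tf.reduce_sum(output, axis=0)
print(reduced.numpy())  # [0.2 0.  0.4 0.8 0.6]
```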
To use it within Keras, define a custom layer like this:
from tensorflow.keras import layers

class ReductionLayer(layers.Layer):
    def __init__(self):
        super(ReductionLayer, self).__init__()

    def call(self, inputs):
        return tf.reduce_sum(inputs, axis=0)
and add it to your Sequential() model:
model.add(ReductionLayer())
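Note that reducing over axis 0 without keepdims drops the batch dimension, which the asker's accepted solution restores with keepdims=True. A minimal end-to-end sketch (assuming TF 2.x) with that flag added so downstream Dense layers still see a rank-2 tensor:

```python
import tensorflow as tf
from tensorflow.keras import layers

class ReductionLayer(layers.Layer):
    def call(self, inputs):
        # keepdims=True keeps a (1, features) shape for later Dense layers
        return tf.reduce_sum(inputs, axis=0, keepdims=True)

model = tf.keras.Sequential([
    ReductionLayer(),
    layers.Dense(3, activation='softmax'),
])

# A batch of 4 five-class vectors yields one 3-class prediction
preds = model(tf.ones((4, 5)))
print(preds.shape)  # (1, 3)
```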
Upvotes: 2
Reputation: 2331
If I understand your problem correctly, all you need is to sum the output of model1's last Dense layer. You can achieve that using the Keras backend sum:
keras.backend.sum(x, axis=None, keepdims=False)
You can find the docs here: https://keras.io/backend/#sum
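Note the keepdims parameter in that signature: it defaults to False, which drops the summed axis entirely. A minimal sketch (assuming TF 2.x, where the backend lives at tensorflow.keras.backend) contrasting the two shapes:

```python
import tensorflow as tf
from tensorflow.keras import backend as K

x = tf.constant([[0.2, 0.0, 0.0, 0.8, 0.0],
                 [0.0, 0.0, 0.4, 0.0, 0.6]])

# Default keepdims=False drops the summed axis entirely
flat = K.sum(x, axis=0)
print(flat.shape)  # (5,)

# keepdims=True keeps a rank-2 shape, which downstream Keras layers expect
kept = K.sum(x, axis=0, keepdims=True)
print(kept.shape)  # (1, 5)
```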
Upvotes: 1