js_55

Reputation: 227

Feeding logits from a multi-input model architecture into a single TensorFlow sigmoid cross-entropy loss function

I have implemented the vast majority of a multi-input model, and I am struggling with the last few steps: defining a single loss function for two incoming logit tensors.

Let's say I have the following:

logits_cnn  = tf.layers.dense(input1, units=num_classes)  # shape is [batch_size, num_classes] (e.g. [64, 1])
logits_lstm = tf.layers.dense(input2, units=num_classes)  # shape is [batch_size, num_classes] (e.g. [64, 1])

Then I want to feed them to the following loss function:

tf.losses.sigmoid_cross_entropy(multi_class_labels, logits, ...)
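
For a single branch this is straightforward, since the labels and the logits share the same [64, 1] shape (a minimal sketch, where y is my [64, 1] label tensor):

loss_cnn_only = tf.losses.sigmoid_cross_entropy(multi_class_labels=y, logits=logits_cnn)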

I was thinking of something like this:

logits_concat = tf.concat(values=[logits_cnn, logits_lstm], axis=-1, name='concat_logits')
loss = tf.losses.sigmoid_cross_entropy(multi_class_labels=y, logits=logits_concat)

Does this loss implementation make sense? I feel that it is not a conceptually correct approach. For example, if y has shape [64, 1] and logits_concat has shape [64, 2], then I am implying that there are two classes in play, which is not my intent.
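
To make the shape concern concrete, here is a minimal sketch (placeholders stand in for the real inputs, and batch_size = 64 is assumed as in the shapes above):

import tensorflow as tf

y = tf.placeholder(tf.float32, shape=[64, 1])            # one binary label per example
logits_cnn = tf.placeholder(tf.float32, shape=[64, 1])   # stand-in for the CNN dense output
logits_lstm = tf.placeholder(tf.float32, shape=[64, 1])  # stand-in for the LSTM dense output

logits_concat = tf.concat([logits_cnn, logits_lstm], axis=-1)  # shape [64, 2]
# As far as I understand, tf.losses.sigmoid_cross_entropy wants the labels and the
# logits to have matching shapes, so y would have to be tiled to [64, 2]:
# loss = tf.losses.sigmoid_cross_entropy(multi_class_labels=tf.tile(y, [1, 2]),
#                                        logits=logits_concat)
# which treats each example as having two independent binary outputs, i.e. the
# "2 classes" interpretation I want to avoid.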

Upvotes: 0

Views: 31

Answers (0)
