user3377018

Reputation: 93

What would be the equivalent of PyTorch's torch.nn.CosineEmbeddingLoss in TensorFlow?

PyTorch's CosineEmbeddingLoss is exactly the function I am looking for in TensorFlow, but the closest thing I can find is tf.losses.cosine_distance. Is there a way to write CosineEmbeddingLoss in TensorFlow, or existing code that does?
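
For reference, this is the behaviour I want to reproduce: per pair, the loss is 1 - cos(x1, x2) when the target is 1 and max(0, cos(x1, x2) - margin) when it is -1 (the tensors below are just an illustration):

import torch

x1 = torch.randn(4, 8)
x2 = torch.randn(4, 8)
target = torch.tensor([1, -1, 1, -1])

loss_fn = torch.nn.CosineEmbeddingLoss(margin=0.5)
print(loss_fn(x1, x2, target))  # mean over the batch by default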

Upvotes: 2

Views: 2981

Answers (1)

Allen Lavoie

Reputation: 5808

A TensorFlow version of CosineEmbeddingLoss:

import tensorflow as tf
from tensorflow import keras

cosine_similarity_loss = keras.losses.CosineSimilarity(
    reduction='none'
)

# The target follows PyTorch's convention: 1 for a similar pair, -1 for a
# dissimilar one. Returning a closure keeps the margin fixed, so the
# resulting function only needs (input_one, input_two, target); with a thin
# wrapper it can also be plugged into `model.compile` (see the sketch below).
def CosineEmbeddingLoss(margin=0.):
    def cosine_embedding_loss_fn(input_one, input_two, target):
        # keras.losses.CosineSimilarity returns the *negative* cosine
        # similarity, so negate it to recover cos(input_one, input_two).
        similarity = -cosine_similarity_loss(input_one, input_two)
        # target == 1  ->  1 - cos(x1, x2)
        # target == -1 ->  max(0, cos(x1, x2) - margin)
        return tf.reduce_mean(
            tf.where(
                tf.equal(target, 1),
                1. - similarity,
                tf.maximum(
                    tf.zeros_like(similarity), similarity - margin
                )
            )
        )
    return cosine_embedding_loss_fn
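
To use this from `model.compile`, which only hands a loss `(y_true, y_pred)`, one option is a thin wrapper. A minimal sketch, assuming a Siamese-style model that outputs the two embeddings stacked along axis 1 (so `y_pred` has shape `[batch, 2, dim]`) and the ±1 targets passed as the labels; the name `cosine_embedding_keras_loss` is just an example:

def cosine_embedding_keras_loss(margin=0.):
    base_loss = CosineEmbeddingLoss(margin=margin)
    def loss(y_true, y_pred):
        # y_true arrives as [batch, 1] (or [batch]); flatten it to [batch]
        # so it lines up with the per-sample similarities.
        targets = tf.reshape(y_true, [-1])
        return base_loss(y_pred[:, 0], y_pred[:, 1], targets)
    return loss

# model.compile(optimizer='adam', loss=cosine_embedding_keras_loss(margin=0.5))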

Running it alongside PyTorch's version:

import numpy as np
import torch

first_values = np.random.normal(size=[100, 3])
second_values = np.random.normal(size=[100, 3])
labels = np.random.randint(2, size=[100]) * 2 - 1

torch_result = torch.nn.CosineEmbeddingLoss(margin=0.5)(
    torch.tensor(first_values, dtype=torch.float32),
    torch.tensor(second_values, dtype=torch.float32),
    torch.tensor(labels),
).numpy()

tf_result = CosineEmbeddingLoss(margin=0.5)(
    first_values, second_values, labels
).numpy()

print(torch_result, tf_result)

Seems to match to within reasonable precision:

0.58433354 0.5843335801639801

Upvotes: 6
