Kareem Amr

Reputation: 81

Euclidean distance in Keras

I came across some Keras code for a siamese network where two ndarrays, each of size (?, 128), are passed to a layer that takes their difference, and then to a Lambda layer that computes the square root of the sum of squares of the resulting array. The purpose of this is to get the Euclidean distance between the two initial arrays.

embedded_distance = layers.Subtract(name='subtract_embeddings')([encoded_1, encoded_2])

embedded_distance = layers.Lambda(lambda x: K.sqrt(K.sum(K.square(x), axis=-1, keepdims=True)), name='euclidean_distance')(embedded_distance)

What confuses me is that, according to the visual architecture of the model, the output size of that layer, as well as the input size of the following dense layer, is also (?, 128). Isn't it supposed to be just a single number per pair? Otherwise, how does the sum work?

Here is the link to the class, if anyone is interested, as well as the visual architecture. (Note: this code is unchanged by me, and it does work; I've trained a model with it.)

http://codebin.herokuapp.com/?s=5e162c612cdd6f0004000001

https://i.sstatic.net/PNMjQ.jpg

Upvotes: 2

Views: 6203

Answers (2)

Minions

Reputation: 5477

import keras.backend as K


def euclidean_distance_loss(y_true, y_pred):
    """
    Euclidean distance loss
    https://en.wikipedia.org/wiki/Euclidean_distance
    :param y_true: TensorFlow/Theano tensor
    :param y_pred: TensorFlow/Theano tensor of the same shape as y_true
    :return: tensor of per-sample Euclidean distances, shape (batch,)
    """
    return K.sqrt(K.sum(K.square(y_pred - y_true), axis=-1))

Use it:

model.compile(loss=euclidean_distance_loss, optimizer='rmsprop')
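As a quick sanity check (a minimal sketch; the toy tensors below are made up for illustration), you can evaluate the loss on constant tensors and confirm it returns one distance per sample:

import numpy as np
import keras.backend as K

# Toy batch: 3 samples of dimension 4 (hypothetical values).
y_true = K.constant(np.zeros((3, 4)))
y_pred = K.constant(np.ones((3, 4)))

# Each sample's distance is sqrt(4 * 1^2) = 2.
print(K.eval(euclidean_distance_loss(y_true, y_pred)))  # [2. 2. 2.]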

resource: https://riptutorial.com/keras/example/32022/euclidean-distance-loss

Upvotes: 3

hola

Reputation: 612

The problem you are reporting is related to the fact that you are using

distance_metric == 'weighted_l1'

You should change this line of code

embedded_distance = layers.Lambda(lambda x: K.abs(x))(embedded_distance)

To

embedded_distance = layers.Lambda(
        lambda x: K.sum(K.abs(x), axis=-1, keepdims=True), 
        name='euclidean_distance')(embedded_distance)

Note also that adding output = layers.Dense(1, activation='sigmoid')(embedded_distance) serves little purpose, because the input to this layer is already a scalar in this case.
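To see the shapes involved (a minimal sketch; the two Input layers below are hypothetical stand-ins for the outputs of the shared encoder):

import keras.backend as K
from keras.layers import Input, Subtract, Lambda
from keras.models import Model

# Hypothetical stand-ins for the two (?, 128) encoder outputs.
encoded_1 = Input(shape=(128,))
encoded_2 = Input(shape=(128,))

embedded_distance = Subtract(name='subtract_embeddings')([encoded_1, encoded_2])
embedded_distance = Lambda(
    lambda x: K.sum(K.abs(x), axis=-1, keepdims=True),
    name='weighted_l1_distance')(embedded_distance)

model = Model([encoded_1, encoded_2], embedded_distance)
model.summary()  # the Lambda's output shape is (None, 1): one scalar per pair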

Upvotes: 1
