Milad

Reputation: 471

tf.keras.layers.Dense output changes for the same input row

I'm using a dense layer in TF2 in graph mode. The input to the dense layer has shape batch_size * sen_len * embedding_size, and the layer is defined as tf.keras.layers.Dense(units=feature_dim, activation='relu')

The output has shape batch_size * sen_len * units.

My expectation was that each row of the output depends only on the corresponding row of the input. However, I see that the values in the first rows change when a new row is added to the input, even though those first rows themselves are unchanged. In one of the cases, the input is:

[[[0.0571852736 0.0287841056 0.101935618 0.0874886662 1.05053329 0.98418349 0.969990492 0.945339322]
[0.847961783 0.140915036 0.0435606614 0.0663332716 0.540852189 1.11434972 0.913121104 0.955932319]
[0.981501162 0.268874854 0.0995861515 0.0536317527 -0.327278584 0.866326749 0.987028778 0.913541615]
[0.213323712 0.365725756 0.109582983 0.0546317473 -0.901124239 0.841596663 0.986778796 0.913539171]]]

output:

[[[0.578112364 0 0.658412695 0 0 0 0.261683643 0]
[0.310602546 0 0.123107374 0 0 0 0.483636916 0]
[0.275210589 0.00601896644 0 0 0 0.212952077 0.767679 0]
[0.45270589 0.806572795 0 0 0 0 0.787857771 0]]]

and after adding another row to the input, the output changes. Input:

[[[0.0571852736 0.0287841056 0.101935618 0.0874886662 1.05053329 0.98418349 0.969990492 0.945339322]
[0.847961783 0.140915036 0.0435606614 0.0663332716 0.540852189 1.11434972 0.913121104 0.955932319]
[0.981501162 0.268874854 0.0995861515 0.0536317527 -0.327278584 0.866326749 0.987028778 0.913541615]
[0.213323712 0.365725756 0.109582983 0.0546317473 -0.901124239 0.841596663 0.986778796 0.913539171]
[-0.215699628 1.06977916 0.238905698 0.0676310733 -0.868791223 -0.142939389 0.974456 0.91341567]]]

output:

[[[0.578112364 0 0.658412755 0 0 0 0.261683613 0]
[0.310602546 0 0.123107374 0 0 0 0.483636916 0]
[0.275210589 0.00601896644 0 0 0 0.212952077 0.767679 0]
[0.45270589 0.806572795 0 0 0 0 0.787857771 0]
[0.670830309 0.880182147 0 0.296360224 0 0 0.905243635 0]]]

As mentioned, the output changed for the first row (e.g. 0.658412695 became 0.658412755) while the input for that row stayed the same.
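A minimal sketch of the situation (the random weights, seed, and shapes here are stand-ins for my actual model): run the same Dense layer on a 4-row input and on the same input with a 5th row appended, then compare the shared rows.

```python
import numpy as np
import tensorflow as tf

tf.random.set_seed(0)
dense = tf.keras.layers.Dense(units=8, activation='relu')

x4 = tf.random.normal((1, 4, 8))                           # batch=1, sen_len=4, emb=8
x5 = tf.concat([x4, tf.random.normal((1, 1, 8))], axis=1)  # append a 5th row

out4 = dense(x4).numpy()
out5 = dense(x5).numpy()

# The shared rows agree up to float32 rounding, but not always bit-for-bit,
# because the batched matmul may use a different kernel/reduction order.
print(np.allclose(out4[0], out5[0, :4], atol=1e-5))
```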

Upvotes: 0

Views: 125

Answers (1)

ATIF ADIB

Reputation: 589

The output of your model is effectively the same in both cases; the tiny differences (on the order of 1e-7) are floating-point rounding. If you want the values to match exactly every time, you would have to control the precision of the output values.

You can do this in two ways:

 # 1. Apply a Keras precision policy to your network

     from tensorflow.keras import mixed_precision
     policy = mixed_precision.Policy('float32')
     mixed_precision.set_global_policy(policy)

 # 2. Round the output tensor to a fixed precision.

     import numpy as np

     output = model.predict(input)
     output = np.round(output, 2)
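For example, comparing corresponding first-row values from the question's two runs with a tolerance (a sketch using NumPy's allclose, not part of the model code) shows they agree within float32 rounding error:

```python
import numpy as np

# Corresponding first-row values from the question's two runs.
a = np.array([0.578112364, 0.658412695, 0.261683643])
b = np.array([0.578112364, 0.658412755, 0.261683613])

# The differences are at most ~6e-8: rounding noise, not a real change.
print(np.allclose(a, b, atol=1e-6))  # prints True
```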

Upvotes: 1
