Reputation: 569
I'm trying to use Keras' functional API to handle multiple inputs with a custom loss function, RMSLE. Below is my code:
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import *
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras import backend as K
from tensorflow.keras.losses import MeanSquaredLogarithmicError

def rmsle(y_true, y_pred):
    return K.sqrt(MeanSquaredLogarithmicError(y_true, y_pred))
def build_model():
    i_language = Input(shape=(1,))
    i_year = Input(shape=(1,))
    i_abstract = Input(shape=(100,))

    input = concatenate([i_language, i_year, i_abstract])
    x = Dense(64)(input)
    x = Dense(1, activation='softmax')(x)

    model = Model(inputs=[i_language, i_year, i_abstract], outputs=x)
    model.compile(optimizer='adam', loss=rmsle)
    return model
model = build_model()
x1 = np.random.randint(3, size=(100, 1)).astype('float32')
x2 = np.random.randint(59, size=(100, 1)).astype('float32')
x3 = np.random.randn(100, 100)
y = np.random.rand(100,1)
model.fit([x1,x2,x3], y)
where x1, x2, x3 are all sample inputs and y is a sample output. But the last line, model.fit(), throws the error:
TypeError Traceback (most recent call last)
<ipython-input-33-66ea59ad4aed> in <module>()
5 y = np.random.rand(100,1)
6
----> 7 model.fit([x1,x2,x3], y)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
984 except Exception as e: # pylint:disable=broad-except
985 if hasattr(e, "ag_error_metadata"):
--> 986 raise e.ag_error_metadata.to_exception(e)
987 else:
988 raise
TypeError: in user code:
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/training.py:855 train_function *
return step_function(self, iterator)
<ipython-input-17-6a742f71a83b>:2 rmsle *
return K.sqrt(MeanSquaredLogarithmicError(y_true, y_pred))
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/losses.py:506 __init__ **
mean_squared_logarithmic_error, name=name, reduction=reduction)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/losses.py:241 __init__
super(LossFunctionWrapper, self).__init__(reduction=reduction, name=name)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/losses.py:102 __init__
losses_utils.ReductionV2.validate(reduction)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/utils/losses_utils.py:76 validate
if key not in cls.all():
/usr/local/lib/python3.7/dist-packages/tensorflow/python/util/dispatch.py:206 wrapper
return target(*args, **kwargs)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/ops/math_ops.py:1800 tensor_equals
self, other = maybe_promote_tensors(self, other)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/ops/math_ops.py:1202 maybe_promote_tensors
ops.convert_to_tensor(tensor, dtype, name="x"))
/usr/local/lib/python3.7/dist-packages/tensorflow/python/profiler/trace.py:163 wrapped
return func(*args, **kwargs)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py:1566 convert_to_tensor
ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/constant_op.py:339 _constant_tensor_conversion_function
return constant(v, dtype=dtype, name=name)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/constant_op.py:265 constant
allow_broadcast=True)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/constant_op.py:283 _constant_impl
allow_broadcast=allow_broadcast))
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/tensor_util.py:457 make_tensor_proto
_AssertCompatible(values, dtype)
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/tensor_util.py:337 _AssertCompatible
(dtype.name, repr(mismatch), type(mismatch).__name__))
TypeError: Expected float32, got 'auto' of type 'str' instead.
I haven't encountered this error before and do not understand what's happening. Could someone please help me get rid of this error?
Upvotes: 1
Views: 2336
Reputation: 97
A quotation from the Keras documentation: "Note that all losses are available both via a class handle and via a function handle. The class handles enable you to pass configuration arguments to the constructor (e.g. loss_fn = CategoricalCrossentropy(from_logits=True)), and they perform reduction by default when used in a standalone way (see details below)."
[I made the same mistake.]
If you use the loss class keras.losses.MeanSquaredLogarithmicError, it needs to be instantiated before it is called on tensors. If you instead use the function keras.losses.mean_squared_logarithmic_error, it can be passed to (or called in) the model directly.
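A minimal sketch of the difference, using toy tensors rather than anything from the question:

import tensorflow as tf
from tensorflow.keras.losses import (
    MeanSquaredLogarithmicError,
    mean_squared_logarithmic_error,
)

y_true = tf.constant([[1.0], [2.0]])
y_pred = tf.constant([[1.5], [1.8]])

# Class handle: instantiate first, then call the instance.
# It applies reduction by default, so the result is a scalar.
msle_obj = MeanSquaredLogarithmicError()
loss_from_class = msle_obj(y_true, y_pred)

# Function handle: call it directly, no instantiation needed.
# It returns one loss value per sample.
loss_from_function = mean_squared_logarithmic_error(y_true, y_pred)

In the question's code, MeanSquaredLogarithmicError(y_true, y_pred) treats the two tensors as constructor arguments (reduction and name), which is what produces the "Expected float32, got 'auto'" TypeError in the traceback.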
Upvotes: 2
Reputation: 1104
Replace your custom loss with the following; MeanSquaredLogarithmicError is a loss class, so it has to be instantiated before it can be called on tensors:

def rmsle(y_true, y_pred):
    msle = MeanSquaredLogarithmicError()
    return K.sqrt(msle(y_true, y_pred))
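If you would rather not construct a loss object inside the loss function, here is a sketch using the function handle keras.losses.mean_squared_logarithmic_error instead (note that the function returns per-sample values, so Keras averages the per-sample square roots, which is not numerically identical to the square root of the already-averaged MSLE above):

from tensorflow.keras.losses import mean_squared_logarithmic_error

def rmsle(y_true, y_pred):
    # Function handle: returns the per-sample MSLE, no instantiation needed.
    return K.sqrt(mean_squared_logarithmic_error(y_true, y_pred))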
Upvotes: 3