RyanB

Reputation: 13

Keras LSTM TypeError messages

I am trying to understand how to use Keras for supply-chain forecasting, and I keep getting errors that I can't find help for elsewhere. I've worked through similar tutorials (sunspot forecasting, multivariate pollution forecasting, etc.), but I still don't understand how the input_shape argument works or how to organize my data so that Keras will accept it.

My dataset is a single time series describing the number of products we sold every month. I took that single time series of 107 months and turned it into a dataset of 30 rows and 77 columns, then split it into a training set and a test set.
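A minimal sketch of the kind of windowing I mean (with `series` as a placeholder for the actual monthly sales figures):

import numpy as np

# Placeholder for the actual monthly sales figures (107 values).
series = np.arange(107, dtype="float32")

window = 77  # number of past months used as inputs for each row

# Each row holds 77 consecutive months; the month that follows is the target.
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

print(X.shape)  # (30, 77)
print(y.shape)  # (30,)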

But no matter what I do, I can't even create a model without some kind of error.

Keras version: 1.2.0

C:\Users\Ryan.B>python -c "import keras; print(keras.__version__)"

Using TensorFlow backend.

1.2.0

Python Version: 3.5.4

Here's the code and the respective errors I'm getting.

model = Sequential()
model.add(LSTM(units=64, input_shape=(77, 1), output_dim=1))

C:\Python35\lib\site-packages\keras\backend\tensorflow_backend.py in concatenate(tensors, axis)
   1219         try:
-> 1220             return tf.concat_v2([to_dense(x) for x in tensors], axis)
   1221         except AttributeError:

AttributeError: module 'tensorflow' has no attribute 'concat_v2'

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
<ipython-input-21-94f09519ff46> in <module>()
      1 model = Sequential()
----> 2 model.add(LSTM(input_shape=(77, 1), output_dim = 1))
      3 #model.add(Dense(10, activation = 'relu'))
      4 #model.add(Dense(1, activation = 'softmax'))

C:\Python35\lib\site-packages\keras\models.py in add(self, layer)
    292                 else:
    293                     input_dtype = None
--> 294                 layer.create_input_layer(batch_input_shape, input_dtype)
    295 
    296             if len(layer.inbound_nodes) != 1:

C:\Python35\lib\site-packages\keras\engine\topology.py in create_input_layer(self, batch_input_shape, input_dtype, name)
    396         # and create the node connecting the current layer
    397         # to the input layer we just created.
--> 398         self(x)
    399 
    400     def add_weight(self, shape, initializer, name=None,

C:\Python35\lib\site-packages\keras\engine\topology.py in __call__(self, x, mask)
    541                                      '`layer.build(batch_input_shape)`')
    542             if len(input_shapes) == 1:
--> 543                 self.build(input_shapes[0])
    544             else:
    545                 self.build(input_shapes)

C:\Python35\lib\site-packages\keras\layers\recurrent.py in build(self, input_shape)
    761                                       self.W_f, self.U_f, self.b_f,
    762                                       self.W_o, self.U_o, self.b_o]
--> 763             self.W = K.concatenate([self.W_i, self.W_f, self.W_c, self.W_o])
    764             self.U = K.concatenate([self.U_i, self.U_f, self.U_c, self.U_o])
    765             self.b = K.concatenate([self.b_i, self.b_f, self.b_c, self.b_o])

C:\Python35\lib\site-packages\keras\backend\tensorflow_backend.py in concatenate(tensors, axis)
   1220             return tf.concat_v2([to_dense(x) for x in tensors], axis)
   1221         except AttributeError:
-> 1222             return tf.concat(axis, [to_dense(x) for x in tensors])
   1223 
   1224 

C:\Python35\lib\site-packages\tensorflow\python\ops\array_ops.py in concat(values, axis, name)
   1041       ops.convert_to_tensor(axis,
   1042                             name="concat_dim",
-> 1043                             dtype=dtypes.int32).get_shape(
   1044                             ).assert_is_compatible_with(tensor_shape.scalar())
   1045       return identity(values[0], name=scope)

C:\Python35\lib\site-packages\tensorflow\python\framework\ops.py in convert_to_tensor(value, dtype, name, preferred_dtype)
    674       name=name,
    675       preferred_dtype=preferred_dtype,
--> 676       as_ref=False)
    677 
    678 

C:\Python35\lib\site-packages\tensorflow\python\framework\ops.py in internal_convert_to_tensor(value, dtype, name, as_ref, preferred_dtype)
    739 
    740         if ret is None:
--> 741           ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
    742 
    743         if ret is NotImplemented:

C:\Python35\lib\site-packages\tensorflow\python\framework\constant_op.py in _constant_tensor_conversion_function(v, dtype, name, as_ref)
    111                                          as_ref=False):
    112   _ = as_ref
--> 113   return constant(v, dtype=dtype, name=name)
    114 
    115 

C:\Python35\lib\site-packages\tensorflow\python\framework\constant_op.py in constant(value, dtype, shape, name, verify_shape)
    100   tensor_value = attr_value_pb2.AttrValue()
    101   tensor_value.tensor.CopyFrom(
--> 102       tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
    103   dtype_value = attr_value_pb2.AttrValue(type=tensor_value.tensor.dtype)
    104   const_tensor = g.create_op(

C:\Python35\lib\site-packages\tensorflow\python\framework\tensor_util.py in make_tensor_proto(values, dtype, shape, verify_shape)
    372       nparray = np.empty(shape, dtype=np_dt)
    373     else:
--> 374       _AssertCompatible(values, dtype)
    375       nparray = np.array(values, dtype=np_dt)
    376       # check to them.

C:\Python35\lib\site-packages\tensorflow\python\framework\tensor_util.py in _AssertCompatible(values, dtype)
    300     else:
    301       raise TypeError("Expected %s, got %s of type '%s' instead." %
--> 302                       (dtype.name, repr(mismatch), type(mismatch).__name__))
    303 
    304 

TypeError: Expected int32, got <tf.Variable 'lstm_3_W_i:0' shape=(1, 1) dtype=float32_ref> of type 'Variable' instead.

Any help resolving these errors, and in understanding how input_shape and output_dim work, would be appreciated!

Eventually I want to start using things like monthly marketing budget/metrics and sales-team metrics as external regressors for multivariate forecasting, but one step at a time. Thank you for your time and input!

Upvotes: 0

Views: 687

Answers (1)

desertnaut

Reputation: 60321

You should really upgrade to Keras 2; in Keras 1.x, units is not even a valid argument, hence your error:

import keras
from keras.models import Sequential
from keras.layers import LSTM
keras.__version__
# '2.2.4'

Your case still gives an error in Keras 2, albeit a different one:

model = Sequential()
model.add(LSTM(units=64, input_shape=(77, 1), output_dim=1))
[...]
TypeError: For the `units` argument, the layer received both the legacy keyword argument `output_dim` and the Keras 2 keyword argument `units`. Stick to the latter!

Omitting the legacy output_dim argument, as the message advises, we get it to work:

model = Sequential()
model.add(LSTM(units=64, input_shape=(77, 1)))

model.summary()
# result:
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 64)                16896     
=================================================================
Total params: 16,896
Trainable params: 16,896
Non-trainable params: 0
_________________________________________________________________
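Regarding input_shape: for an LSTM it is (timesteps, features), with the batch dimension left out. So input_shape=(77, 1) means each sample is a sequence of 77 steps with 1 feature per step, and the array you pass to fit() must be 3D with shape (samples, 77, 1). The 16,896 parameters above come from the usual LSTM count, 4 * (units * (units + input_dim) + units) = 4 * (64 * (64 + 1) + 64) = 16,896. A minimal Keras 2 sketch with placeholder arrays (not your data) showing the expected shapes:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Placeholder arrays shaped like the dataset described in the question:
# 30 samples, 77 timesteps, 1 feature per timestep.
X_train = np.random.rand(30, 77, 1)
y_train = np.random.rand(30, 1)

model = Sequential()
model.add(LSTM(units=64, input_shape=(77, 1)))  # (timesteps, features); batch size omitted
model.add(Dense(1))                             # single regression output: next month's sales

model.compile(optimizer='adam', loss='mse')
model.fit(X_train, y_train, epochs=10, batch_size=8, verbose=0)

For a single continuous target like monthly sales, a plain Dense(1) output with an MSE loss is the usual choice; a softmax on a single unit (as in the commented-out line in your traceback) would always output 1.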

So, I seriously suggest you upgrade to Keras 2 (I highly doubt that Keras 1.x works OK with Tensorflow 1.2), and open a new question if you still have issues...

Upvotes: 1
