Reputation: 89
I'm just starting with TensorFlow, and when I call m.fit(input_fn=lambda: self.input_fn(train_data), steps=train_steps) I receive the following error.
File "/Library/Python/2.7/site-packages/tensorflow/contrib/layers/python/layers/feature_column_ops.py", line 161, in _input_from_feature_columns
transformed_tensor = transformer.transform(column)
File "/Library/Python/2.7/site-packages/tensorflow/contrib/layers/python/layers/feature_column_ops.py", line 882, in transform
feature_column.insert_transformed_feature(self._columns_to_tensors)
File "/Library/Python/2.7/site-packages/tensorflow/contrib/layers/python/layers/feature_column.py", line 991, in insert_transformed_feature
self.sparse_id_column.insert_transformed_feature(columns_to_tensors)
File "/Library/Python/2.7/site-packages/tensorflow/contrib/layers/python/layers/feature_column.py", line 572, in insert_transformed_feature
name="lookup")
File "/Library/Python/2.7/site-packages/tensorflow/contrib/lookup/lookup_ops.py", line 1018, in index_table_from_tensor
"integer" if dtype.is_integer else "non-integer", keys.dtype))
ValueError: Expected non-integer, got <dtype: 'int32'>.
In the feature columns that I pass to fit() there are only int32 and int64 values, but that should not be the problem, should it?
Upvotes: 0
Views: 2361
Reputation: 68
I think this can happen when you use categorical features backed by tf.SparseTensor, but the corresponding columns in your data contain int32 values.
In that case, just convert the integer columns to strings, for example like this:
# using Pandas: cast every categorical column to string so the
# string-based sparse feature columns can look the values up
for f in categorical_features:
    df_train[f] = df_train[f].astype(str)
    df_test[f] = df_test[f].astype(str)
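For context, as far as I can tell the keys-based sparse columns in tf.contrib.layers build a string lookup table via index_table_from_tensor, which is exactly where your traceback ends up. Below is a minimal sketch of two ways to define such columns once the dtypes match; the column names ("education", "zip_code") are hypothetical placeholders, not taken from your code:
import tensorflow as tf

# Option 1: keep the string conversion above and use a string-based sparse column.
education = tf.contrib.layers.sparse_column_with_hash_bucket(
    "education", hash_bucket_size=1000)  # expects tf.string input

# Option 2 (assumption about your setup): skip the conversion and declare the
# column as already integerized, so it accepts integer ids directly.
zip_code = tf.contrib.layers.sparse_column_with_integerized_feature(
    "zip_code", bucket_size=100000, dtype=tf.int64)

# Either sparse column can then be wrapped as usual, e.g. in an embedding:
education_emb = tf.contrib.layers.embedding_column(education, dimension=8)
With sparse_column_with_integerized_feature you can keep the int32/int64 values and skip the Pandas conversion entirely, provided the ids already fall in [0, bucket_size).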
Upvotes: 2