Reputation: 1120
When I use a hash table lookup inside tf.contrib.data.Dataset.map(), it fails with the following error:
TypeError: In op 'hash_table_Lookup', input types ([tf.string, tf.string, tf.int32]) are not compatible with expected types ([tf.string_ref, tf.string, tf.int32])
Code to reproduce:
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
initializer = tf.contrib.lookup.KeyValueTensorInitializer(
    ['one', 'two', 'three'], [1, 2, 3])
hash_table = tf.contrib.lookup.HashTable(initializer, -1)
tensor = tf.convert_to_tensor(['one', 'two', 'three'])
dataset = tf.contrib.data.Dataset.from_tensor_slices(tensor)
# The TypeError is raised here, while the map function is being traced:
dataset = dataset.map(lambda k: hash_table.lookup(k))
It complains about tf.string_ref and tf.string being incompatible. It's strange that it expects a tf.string_ref and not a tf.string. Does anyone know why this is the case and what I can do about it?
The issue is related to table_ref being tf.string_ref here.
Upvotes: 0
Views: 536
Reputation: 126154
This is a bug that was fixed in TensorFlow 1.3. If you are using TensorFlow 1.2, the following workaround should work:
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
# Use internal library implementation of `lookup_ops` in TensorFlow 1.2.
from tensorflow.python.ops import lookup_ops
initializer = lookup_ops.KeyValueTensorInitializer(
    ['one', 'two', 'three'], [1, 2, 3])
hash_table = lookup_ops.HashTable(initializer, -1)
tensor = tf.convert_to_tensor(['one', 'two', 'three'])
dataset = tf.contrib.data.Dataset.from_tensor_slices(tensor)
dataset = dataset.map(lambda k: hash_table.lookup(k))
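For completeness, here is one way to actually pull the mapped values out of the dataset. This part is a minimal sketch of my own using the standard TF 1.x session API; it assumes Dataset.make_initializable_iterator(), as in the 1.2 contrib API:
# Sketch: consume the dataset. The captured table must be initialized
# before the first element is produced.
iterator = dataset.make_initializable_iterator()
next_element = iterator.get_next()

with tf.Session() as sess:
    sess.run(tf.tables_initializer())  # initializes hash_table
    sess.run(iterator.initializer)
    while True:
        try:
            print(sess.run(next_element))  # prints 1, 2, 3
        except tf.errors.OutOfRangeError:
            break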
Up until TensorFlow 1.2, the tf.contrib.lookup library used "reference types" to represent the lookup tables, whereas the internal library (used to implement tf.contrib.lookup from 1.3 onwards) uses the more modern and compatible "resource types". Reference-typed tensors cannot cross the function boundary that Dataset.map() creates: the ref is dropped, so the table handle arrives inside the map function as a plain tf.string, which is exactly the mismatch reported in the error.
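If you want to see the difference directly, you can inspect the dtype of each table's underlying handle. This is a minimal sketch, assuming the 1.x table classes expose the handle as a table_ref property:
import tensorflow as tf
from tensorflow.python.ops import lookup_ops

# Assumption: both table classes expose their handle as `table_ref`.
contrib_table = tf.contrib.lookup.HashTable(
    tf.contrib.lookup.KeyValueTensorInitializer(['one'], [1]), -1)
core_table = lookup_ops.HashTable(
    lookup_ops.KeyValueTensorInitializer(['one'], [1]), -1)

print(contrib_table.table_ref.dtype)  # string_ref (a reference type)
print(core_table.table_ref.dtype)     # resource (a resource handle)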
Upvotes: 1