filipe

Reputation: 505

VertexAI Endpoint - Unable to coerce value

I have a custom class that turns a pandas DataFrame into a dataset consisting of a list of lists of 7-day periods, with a batch size of 32. For context, this is the code:

def make_dataset(self, dataframe):
    return tf.data.Dataset.from_generator(
        lambda: self.make_generator_dataset(dataframe),
        output_shapes=(
            [None, self.input_width, self.num_input_features],
            [None, self.label_width, self.num_label_features],
        ),
        output_types=(tf.float32, tf.float32),
    )

def make_generator_dataset(self, dataframe):
    grouped_df = dataframe.groupby(level=[0, 1])
    inputs_batch_list = []
    labels_batch_list = []
    for index, campaign_period_data in grouped_df:
        labels = self.process_label(campaign_period_data)
        inputs = self.process_input(campaign_period_data)

        inputs_batch_list.append(inputs)
        labels_batch_list.append(labels)
        if len(inputs_batch_list) == self.batch_size:
            inputs_batch = tf.concat(inputs_batch_list, 0)
            labels_batch = tf.concat(labels_batch_list, 0)
            yield inputs_batch, labels_batch
            inputs_batch_list = []
            labels_batch_list = []
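For reference, the generator yields inputs shaped (32, 7, 69) and labels shaped (32, 1, 1). A standalone sketch of the same pattern, with dummy zero tensors standing in for my real per-campaign windows (all names and sizes here are illustrative):

```python
import tensorflow as tf

# Same batching pattern as make_generator_dataset, but self-contained:
# accumulate per-window tensors and yield a concatenated batch once
# batch_size windows have been collected.
BATCH_SIZE, INPUT_WIDTH, NUM_FEATURES = 32, 7, 69

def batch_generator():
    inputs_batch_list, labels_batch_list = [], []
    for _ in range(BATCH_SIZE * 2):  # dummy data: enough for two batches
        inputs_batch_list.append(tf.zeros([1, INPUT_WIDTH, NUM_FEATURES]))
        labels_batch_list.append(tf.zeros([1, 1, 1]))
        if len(inputs_batch_list) == BATCH_SIZE:
            yield (tf.concat(inputs_batch_list, 0),
                   tf.concat(labels_batch_list, 0))
            inputs_batch_list, labels_batch_list = [], []

dataset = tf.data.Dataset.from_generator(
    batch_generator,
    output_signature=(
        tf.TensorSpec(shape=(None, INPUT_WIDTH, NUM_FEATURES), dtype=tf.float32),
        tf.TensorSpec(shape=(None, 1, 1), dtype=tf.float32),
    ),
)

for inputs, labels in dataset.take(1):
    print(inputs.shape, labels.shape)  # (32, 7, 69) (32, 1, 1)
```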

Whenever I feed it into the endpoint containing the model, I get the following errors:

response = prediction_client.predict(endpoint=endpoint_name, instances = wide_window.test)

ValueError: Unable to coerce value: <tf.Tensor: shape=(32, 7, 69), dtype=float32, numpy=
array([[[ 0.00000000e+00,  0.00000000e+00,  0.00000000e+00, ...,

response = prediction_client.predict(endpoint=endpoint_name, instances = [wide_window.test])

ValueError: Unable to coerce value: <FlatMapDataset element_spec=(TensorSpec(shape=(None, 7, 69), dtype=tf.float32, name=None), TensorSpec(shape=(None, 1, 1), dtype=tf.float32, name=None))>
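If I read these errors correctly, `instances` has to be JSON-serializable (nested plain lists of floats), not a `tf.Tensor` or a `tf.data.Dataset` — but I'm not certain that's the constraint. A minimal sketch of that conversion for one batch, with a dummy array standing in for an element of `wide_window.test`:

```python
import numpy as np

# Stand-in for one (32, 7, 69) inputs batch from the dataset.
batch = np.zeros((32, 7, 69), dtype=np.float32)

# .tolist() converts the array into nested Python lists of floats,
# i.e. 32 instances, each a 7x69 window.
instances = batch.tolist()

print(len(instances), len(instances[0]), len(instances[0][0]))  # 32 7 69
```

The predict call would then presumably be `prediction_client.predict(endpoint=endpoint_name, instances=instances)`, though I don't know whether the deployed model accepts this shape per instance.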

I've also had some difficulty converting this FlatMapDataset into a TFRecord, which I believe the endpoint can interpret. What could the solution be here?
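What I've gotten closest to is serializing each (inputs, labels) batch whole with `tf.io.serialize_tensor` and storing the bytes in a `tf.train.Example` — a sketch with a single dummy batch standing in for the real FlatMapDataset (file path and feature names are mine, not anything Vertex AI requires):

```python
import tensorflow as tf

# Dummy one-batch dataset standing in for the real FlatMapDataset.
dataset = tf.data.Dataset.from_tensors(
    (tf.zeros([32, 7, 69]), tf.zeros([32, 1, 1]))
)

def to_example(inputs, labels):
    # Serialize each tensor whole and store it as a bytes feature.
    feature = {
        "inputs": tf.train.Feature(bytes_list=tf.train.BytesList(
            value=[tf.io.serialize_tensor(inputs).numpy()])),
        "labels": tf.train.Feature(bytes_list=tf.train.BytesList(
            value=[tf.io.serialize_tensor(labels).numpy()])),
    }
    return tf.train.Example(features=tf.train.Features(feature=feature))

with tf.io.TFRecordWriter("/tmp/windows.tfrecord") as writer:
    for inputs, labels in dataset:
        writer.write(to_example(inputs, labels).SerializeToString())
```

I don't know whether this layout is what the endpoint expects, which is part of my question.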

I was expecting the endpoint to return predictions, since the same dataset works perfectly against a locally loaded model:

fastLoad = True
if fastLoad:
    lstm_model = tf.keras.models.load_model('model/my_model')

preds_test = lstm_model.predict(wide_window.test)

This works perfectly on a locally loaded model; feeding the same dataset into the endpoint does not.

Upvotes: 0

Views: 319

Answers (0)
