user293895

Reputation: 1527

Finding the correct JSON format for Google AI Platform inference

I am trying to run predictions against Google's Universal Sentence Encoder using gcloud's ai-platform local predict command. My command looks like this:

gcloud ai-platform local predict --model-dir=/Users/x/Downloads/universal-sentence-encoder/ --json-instances=instances.json --verbosity debug

And instances.json looks like this:

{"inputs": ["Hello World."]}

I get the following back from gcloud:

cloud.ml.prediction.prediction_utils.PredictionError: Failed to run the provided model: Exception during running the graph: Cannot feed value of shape (1, 1) for Tensor 'serving_default_inputs:0', which has shape '(?,)' (Error code: 2)

I believe my input format is wrong, but I can't work out what the correct one is. Does anyone know how to inspect a saved model to find out its expected input format?

Upvotes: 0

Views: 511

Answers (2)

MorganR

Reputation: 81

Svetlana's answer is correct. You can batch multiple requests by putting one JSON object per line, e.g.:

{"inputs": "Hello World."}
{"inputs": "Hello Mars."}

Upvotes: 2

Svetlana

Reputation: 61

You probably need to provide a string instead of a list. The following input may work:

{"inputs": "Hello World."}

Upvotes: 2
