Reputation: 2061
Is there any way to perform the equivalent of gcloud ml-engine local predict --model-dir=$MODEL_DIR --json-instances=$JSON_INSTANCE within a Jupyter notebook?
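For reference, $JSON_INSTANCE is a newline-delimited JSON file with one instance per line, for example (the keys here are just illustrative):
{"x": [6.4, 3.2, 4.5, 1.5], "y": -1}
{"x": [5.8, 3.1, 5.0, 1.7], "y": 5}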
Upvotes: 0
Views: 52
Reputation: 8389
Let me give a quick answer, one that may be updated at some point in the future. Basically, this answer should apply. For example:
import json
from tensorflow.contrib import predictor

def columnarize(instances):
    """Turn a list of instance dicts into a single dict of batched inputs."""
    out = {}
    for instance in instances:
        for k, v in instance.items():
            out.setdefault(k, []).append(v)
    return out

def mapify(outputs, fetch_tensors):
    """Attach output names to a positional list of outputs.

    Only needed if outputs come back as a list (e.g. from a raw session.run
    call); predict_fn below already returns a dict keyed by output name.
    """
    return dict(zip(fetch_tensors.keys(), outputs))

def rowify(columns):
    """Turn a dict of batched outputs back into a list of per-instance dicts."""
    out = []
    num_instances = len(next(iter(columns.values())))
    for row in range(num_instances):
        out.append({
            name: output[row, ...].tolist()
            for name, output in columns.items()
        })
    return out

instances = [
    {"x": [6.4, 3.2, 4.5, 1.5], "y": -1},
    {"x": [5.8, 3.1, 5.0, 1.7], "y": 5},
]

# export_dir is the path to your SavedModel directory, i.e. what you would
# pass to --model-dir.
predict_fn = predictor.from_saved_model(export_dir)
outputs = predict_fn(columnarize(instances))  # dict: output name -> batched array
predictions = rowify(outputs)
print(predictions)
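If you want the notebook equivalent of the --json-instances flag, you can read the instances from a newline-delimited JSON file instead of hard-coding them. A minimal sketch, assuming a hypothetical instances.json with one JSON object per line (the format gcloud ml-engine local predict expects):
# instances.json is a hypothetical stand-in for $JSON_INSTANCE:
# one JSON object per line, e.g. {"x": [6.4, 3.2, 4.5, 1.5], "y": -1}
with open("instances.json") as f:
    instances = [json.loads(line) for line in f if line.strip()]

outputs = predict_fn(columnarize(instances))
predictions = rowify(outputs)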
Upvotes: 2