lemon

Reputation: 747

How to convert a numpy array to JSON for Google Cloud ML?

I have a numpy array (X_test) for testing my model on cloud-ml. For online prediction it has to be converted to JSON format.

My numpy array has the following format:

[[    0     0     0 ...  7464  1951  2861]
 [    0     0     0 ...  3395  1996  4999]
 [    0     0     0 ...  5294  9202 17867]
 ...
 [    0     0     0 ...  3506   977  7818]
 [    0     0     0 ...  1421    75   137]
 [    0     0     0 ... 12857 12686  2928]]

I use the following code to convert it to JSON:

import codecs
import json

b = X_test.tolist()
json_file = "file.json"
json.dump(b, codecs.open(json_file, 'w', encoding='utf-8'), sort_keys=True, indent=4)

After this I use the Google Cloud SDK Shell for the cloud prediction and enter the following command:

gcloud ml-engine predict --model keras_model --version v1 --json-instances file.json

However, I get the following error:

ERROR: (gcloud.ml-engine.predict) Input instances are not in JSON format. See "gcloud ml-engine predict --help" for details.

As I understand it, I converted the numpy array to JSON incorrectly for cloud-ml.

How do I correctly convert the numpy array to JSON to avoid this error?


Update: Here's the code that helped me solve the problem:

import json
b = X_test.tolist()
json_file = "file.json"

with open(json_file, 'w', encoding='utf-8') as f:
    for i in b:
        instance = {"input": i}
        json.dump(instance, f, sort_keys=True)
        f.write("\n")
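With file.json written as one instance per line, the same command as before accepts it:

gcloud ml-engine predict --model keras_model --version v1 --json-instances file.json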

Upvotes: 4

Views: 809

Answers (3)

dwjbosman

Reputation: 966

The documentation:

https://cloud.google.com/ml-engine/docs/tensorflow/online-predict#formatting_instances_as_json_strings

Maybe use something like this:

import numpy as np
import codecs

X_test = np.zeros((5,5))
print(X_test)

import json
b = X_test.tolist()
json_file = "file.json"

with codecs.open(json_file, 'w', encoding='utf-8') as f:
    for i, row in enumerate(b):
        instance = {"values": row, "key": i}
        json.dump(instance, f, sort_keys=True)
        f.write("\n")

file.json becomes:

{"key": 0, "values": [0.0, 0.0, 0.0, 0.0, 0.0]}
{"key": 1, "values": [0.0, 0.0, 0.0, 0.0, 0.0]}
{"key": 2, "values": [0.0, 0.0, 0.0, 0.0, 0.0]}
{"key": 3, "values": [0.0, 0.0, 0.0, 0.0, 0.0]}
{"key": 4, "values": [0.0, 0.0, 0.0, 0.0, 0.0]}

Upvotes: 4

You can check the type of your converted data with the type function (in your case: type(b)) before writing it to the file, to make sure it is a plain Python list. Then just use simple code for writing the JSON:

    import io
    import json

    json.dump(b, io.open(json_file, 'w', encoding='utf-8'))
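For example, a quick sanity check (using a small dummy array here as a stand-in for the real X_test, purely as an assumption) could look like:

    import numpy as np

    X_test = np.zeros((3, 4), dtype=np.int64)  # dummy stand-in for the real test data
    b = X_test.tolist()
    print(type(b))        # <class 'list'> -- a plain Python list, JSON-serialisable
    print(type(b[0][0]))  # <class 'int'>  -- native int, not numpy.int64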

Upvotes: 0

Chandu

Reputation: 2129

For online prediction, the JSON file needs to contain one instance per line.

e.g.

    39,State-gov,77516,Bachelors,13,Never-married,Adm-clerical,Not-in-family,White,Male,2174,0,40,United-States,<=50K
    50,Self-emp-not-inc,83311,Bachelors,13,Married-civ-spouse,Exec-managerial,Husband,White,Male,0,0,13,United-States,<=50K
    38,Private,215646,HS-grad,9,Divorced,Handlers-cleaners,Not-in-family,White,Male,0,0,40,United-States,<=50K
    53,Private,234721,11th,7,Married-civ-spouse,Handlers-cleaners,Husband,Black,Male,0,0,40,United-States,<=50K
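For a JSON input file the same idea applies: each instance is a JSON object on its own line. For example (illustrative, shortened rows, with the input tensor assumed to be named "input" as in the question's update):

    {"input": [0, 0, 0, 7464, 1951, 2861]}
    {"input": [0, 0, 0, 3395, 1996, 4999]}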

You can refer to https://github.com/GoogleCloudPlatform/cloudml-samples

Upvotes: 1
