Integration

Reputation: 347

tensorflowjs loading re-trained coco-ssd model - not working in browser

I used the Python models/research/object_detection API to re-train the coco-ssd model with my own dataset. I saved the model, and it works in an IPython notebook. I then converted it with tfjs_converter:

    tensorflowjs_converter \
        --input_format=tf_saved_model \
        --output_format=tensorflowjs \
        --output_node_names='detection_boxes,detection_classes,detection_scores,num_detections' \
        --saved_model_tags=serve \
        ./saved_model ./web_model

Test 1: my code

    image.src = imageURL;
    var img;
    const runButton = document.getElementById('run');
    runButton.onclick = async () => {
        console.log('model start');
        const model = await modelPromise;
        console.log('model loaded');
        const zeros = tf.zeros([1, 224, 224, 3]);

        const batched = tf.tidy(() => {
            if (!(image instanceof tf.Tensor)) {
                img = tf.fromPixels(image);
            }
            // Reshape to a single-element batch so we can pass it to executeAsync.
            return img.expandDims(0);
        });

        console.log('model loaded - now predict .. start');
        const result = await model.executeAsync(batched);
        console.log('model loaded - now predict - ready'); // Error seen here
        batched.dispose();
        tf.dispose(result);
    };
Console output (I tried changing the model to the stock coco-ssd model; same error):

    model loaded - now predict .. start
    tensor_array.ts:116 Uncaught (in promise) Error: TensorArray : Could not write to TensorArray index 0,
        because the value dtype is int32, but TensorArray dtype is float32.
        at e.write (tensor_array.ts:116)
        at tensor_array.ts:162
        at Array.forEach (<anonymous>)
        at e.writeMany (tensor_array.ts:162)
        at e.scatter (tensor_array.ts:252)
        at control_executor.ts:127
        at callbacks.ts:17
        at Object.next (callbacks.ts:17)
        at callbacks.ts:17

Test 2: using tfjs-models/coco-ssd/demo

I ran `yarn`, then `yarn watch`.

I replaced the coco-ssd model, which works correctly, with my re-trained model (only the model paths were switched):

    // BASE_PATH = "https://storage.googleapis.com/tfjs-models/savedmodel/";
    BASE_PATH = "http://localhost:1234/web_model/";
    // this.modelPath = "" + BASE_PATH + this.getPrefix(e) + "/tensorflowjs_model.pb",
    // this.weightPath = "" + BASE_PATH + this.getPrefix(e) + "/weights_manifest.json";
    this.modelPath = "" + BASE_PATH + "tensorflowjs_model.pb",
    this.weightPath = "" + BASE_PATH + "weights_manifest.json";

I get this error:

    io_utils.ts:116 Uncaught (in promise) RangeError: byte length of float32Array should be a multiple of 4
        at new Float32Array (<anonymous>)
        at o (io_utils.ts:116)
        at Object.decodeWeights (io_utils.ts:79)
        at e.<anonymous> (frozen_model.ts:109)
        at exports_regularizers.ts:47
        at Object.next (exports_regularizers.ts:47)
        at s (exports_regularizers.ts:47)
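
For context, the RangeError above is thrown by the Float32Array constructor itself when the downloaded weight file's byte length is not divisible by 4 (in my understanding this usually means the server returned something other than the raw binary shard). A plain-JavaScript sketch, no tfjs needed (`decodeAsFloat32` is an illustrative name, not a real tfjs function):

```javascript
// A Float32Array view over an ArrayBuffer needs byteLength % 4 === 0,
// since each float32 element occupies 4 bytes.
// (decodeAsFloat32 is my own illustrative helper, not a tfjs API.)
function decodeAsFloat32(buffer) {
    // Mirrors the check tfjs effectively hits when decoding a weight shard.
    return new Float32Array(buffer); // throws RangeError if byteLength % 4 !== 0
}

const ok = decodeAsFloat32(new ArrayBuffer(8)); // 8 bytes -> 2 float32 values
console.log(ok.length); // 2

try {
    decodeAsFloat32(new ArrayBuffer(7)); // 7 bytes: not a multiple of 4
} catch (e) {
    console.log(e.name); // RangeError
}
```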




Upvotes: 3

Views: 2245

Answers (2)

Dhrumil

Reputation: 117

Try these conversion parameters; they worked for me after retraining with mobilenet_v1:

    --output_node_names="Postprocessor/ExpandDims_1,Postprocessor/Slice"
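
Combined with the converter invocation from the question, the full command would look roughly like this (a sketch; paths and the remaining flags are assumed to match the question's setup):

```shell
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_format=tensorflowjs \
    --output_node_names='Postprocessor/ExpandDims_1,Postprocessor/Slice' \
    --saved_model_tags=serve \
    ./saved_model ./web_model
```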

https://github.com/tensorflow/tfjs-models/tree/master/coco-ssd

Upvotes: 1

edkeveked

Reputation: 18381

The error has to do with the tensor image you're using for your prediction.

`tf.fromPixels` creates an image tensor with values ranging from 0 to 255 and dtype int32. Since your model expects a tensor of dtype float32, you can either cast the tensor to float32 or rescale the values to fit between 0 and 1:

  • Casting to float32

        img = tf.fromPixels(image).cast('float32')

  • Shifting values to fit between 0 and 1

        img = tf.fromPixels(image).div(256)
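
The dtype distinction can also be seen outside tfjs: canvas pixel data is 8-bit integers in [0, 255], while the model consumes float32, so the values have to be converted or rescaled explicitly. A minimal plain-JavaScript sketch (no tfjs; the variable names are illustrative):

```javascript
// Pixel data read from a canvas comes back as 8-bit integers (0-255).
const pixels = Uint8ClampedArray.from([0, 64, 128, 255]);

// Analogue of .cast('float32'): same values, stored as float32.
const casted = Float32Array.from(pixels);

// Analogue of .div(256): values rescaled into [0, 1).
const scaled = Float32Array.from(pixels, v => v / 256);

console.log(casted[3]); // 255
console.log(scaled[1]); // 0.25
```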

Upvotes: 2

Related Questions