Akhilesh Sharma

Reputation: 150

Accuracy reduced when predicting using tfjs from model trained by ML5

I am using tfjs 1.0.0 on Google Chrome 76.0.3809.132 (Official Build) (64-bit).

I was using ML5 to train image-classification models in my project, using the Feature Extractor for transfer learning with mobilenet_v1_0.25 as the base model. I wanted to run predictions from a Chrome extension, and since I found that ML5 does not run from the extension's background page, I used tfjs to load the models trained by ML5 and predict with them. However, the prediction accuracy in tfjs is much lower than when predicting with the same model in ML5 itself.

I tried reproducing ML5's prediction pipeline in tfjs by porting code from the ML5 Feature Extractor source, but the accuracy is still much lower when predicting from tfjs.

First, I load MobileNet and the custom model and join them into a single model:

async load() {
    console.log("ML Data loading..");
    // ! ==========================================
    // ! This is a workaround and will only work for the default version and
    // ! alpha values that were used while training the model.
    this.mobilenet = await tf.loadLayersModel("https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json");
    const layer = this.mobilenet.getLayer('conv_pw_13_relu');
    // ! ==========================================

    // tf.model is synchronous, so no await is needed here.
    this.mobilenetFeatures = tf.model({ inputs: this.mobilenet.inputs, outputs: layer.output });
    this.customModel = await tf.loadLayersModel("./model.json");
    this.model = tf.sequential();
    this.model.add(this.mobilenetFeatures);
    this.model.add(this.customModel);
}

I then pass the image to a function that returns the top classes after predicting:

let result = this.getTopKClasses(this.predict(image), 5)

getTopKClasses(logits, topK) {
    const predictions = logits;
    const values = predictions.dataSync();
    predictions.dispose();
    let predictionList = [];
    for (let i = 0; i < values.length; i++) {
        predictionList.push({ value: values[i], index: i });
    }
    predictionList = predictionList
        .sort((a, b) => {
            return b.value - a.value;
        })
        .slice(0, topK);
    console.log(predictionList);
    let site = predictionList[0];
    let result = { type: 'custom', site: IMAGENET_CLASSES[site.index] }
    console.log('ML Result: Site: %s, Probability: %i%', result.site, (site.value * 100));
    if (site.value > ML_THRESHOLD) {
        return result;
    } else {
        return null;
    }
}
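The top-K selection itself is plain JavaScript, so it can be sanity-checked in isolation. Here is a minimal standalone sketch of the same logic (`getTopK` is a hypothetical helper, not part of the code above), using made-up scores:

```javascript
// Standalone version of the top-K logic: pair each score with its
// index, sort descending by score, and keep the first topK entries.
function getTopK(values, topK) {
  return values
    .map((value, index) => ({ value, index }))
    .sort((a, b) => b.value - a.value)
    .slice(0, topK);
}

// Example with made-up scores for a 4-class model.
const top = getTopK([0.1, 0.7, 0.05, 0.15], 2);
console.log(top); // [{ value: 0.7, index: 1 }, { value: 0.15, index: 3 }]
```

If the indices coming out of this step look right, the accuracy problem lies in the prediction itself rather than in the ranking.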

predict(image) {
    const preprocessed = this.imgToTensor(image, [224, 224]);
    console.log(preprocessed);
    const result = this.model.predict(preprocessed);
    return result;
}

Helper functions:

imgToTensor(input, size = null) {
    return tf.tidy(() => {
        let img = tf.browser.fromPixels(input);
        if (size) {
            img = tf.image.resizeBilinear(img, size);
        }
        const croppedImage = this.cropImage(img);
        const batchedImage = croppedImage.expandDims(0);
        return batchedImage.toFloat().div(tf.scalar(127)).sub(tf.scalar(1));
    });
}
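A mismatch in pixel normalization between training and inference is a classic cause of degraded accuracy, so it is worth checking the scaling in isolation. This sketch assumes the same `v / 127 - 1` formula used in `imgToTensor` above (note some MobileNet pipelines use `v / 127.5 - 1` instead):

```javascript
// Normalization used above: v / 127 - 1, mapping 0..255 to roughly -1..1.008.
// If the model was trained with a different scaling (e.g. v / 127.5 - 1),
// predictions degrade even though the model loads fine.
const normalize = (pixel) => pixel / 127 - 1;

console.log(normalize(0));   // -1
console.log(normalize(127)); // 0
console.log(normalize(255)); // ≈ 1.0079
```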

cropImage(img) {
    const size = Math.min(img.shape[0], img.shape[1]);
    const centerHeight = img.shape[0] / 2;
    // Floor the offsets so slice() receives integer coordinates even
    // when the height/width difference is odd.
    const beginHeight = Math.floor(centerHeight - (size / 2));
    const centerWidth = img.shape[1] / 2;
    const beginWidth = Math.floor(centerWidth - (size / 2));
    return img.slice([beginHeight, beginWidth, 0], [size, size, 3]);
};
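The center-crop arithmetic can also be checked without tfjs. `cropBox` below is a hypothetical standalone version of the offset calculation, taking the image's height and width directly:

```javascript
// Standalone version of the center-crop arithmetic: given an image's
// height and width, return the square crop's origin and size.
function cropBox(height, width) {
  const size = Math.min(height, width);
  return {
    beginHeight: Math.floor((height - size) / 2),
    beginWidth: Math.floor((width - size) / 2),
    size,
  };
}

// A 224x300 landscape image yields a 224x224 crop starting 38px in.
console.log(cropBox(224, 300)); // { beginHeight: 0, beginWidth: 38, size: 224 }
```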

Upvotes: 2

Views: 472

Answers (1)

Akhilesh Sharma

Reputation: 150

To predict with a model that was trained using transfer learning (i.e. trained on top of another, pre-trained model), you first run the input through the base model, and then feed the resulting feature tensors into the custom model's predict call.

async load(options = {}) {

    // Loading the layer from the base model.
    this.mobilenet = await tf.loadLayersModel(`${BASE_URL}${this.config.version}_${this.config.alpha}_${IMAGE_SIZE}/model.json`);
    const layer = this.mobilenet.getLayer(this.config.layer);

    // Converting the base-model layer into a model.
    this.mobilenetFeatures = await tf.model({ inputs: this.mobilenet.inputs, outputs: layer.output });

    // Loading the custom model that was trained by us.
    this.customModel = await tf.loadLayersModel(CUSTOM_MODEL_FILE_URL);
}

Now to predict from these models:

predict(image) {
    // Converting image to tensor
    const preprocessed = this.imgToTensor(image, [224, 224])

    // * Make predictions about the image firstly, from the Mobilenet (base) Model.
    const embeddings = this.mobilenetFeatures.predict(preprocessed);

    // * Filter predictions from Mobilenet Model using custom trained Model.
    const result = this.customModel.predict(embeddings);

    return result;
}

Upvotes: 1
