Reputation: 138
I have 2 tflite models hosted as s3 objects on aws. In my react-typescript app, I am trying to load those models if the browser is opened on mobile. Else, the web app will use other more efficient models.
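The mobile check can be as simple as a user-agent test; a minimal sketch (the isMobile helper and its regex are illustrative, not the exact code from my app):

```typescript
// Minimal user-agent based mobile check (illustrative only).
function isMobile(userAgent: string): boolean {
  return /Android|iPhone|iPad|iPod|webOS|BlackBerry|IEMobile|Opera Mini/i.test(
    userAgent
  );
}

// In the app, the result decides which set of models to fetch:
// const useTfliteModels = isMobile(navigator.userAgent);
```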
The Models interface is as follows:
I have configured the s3 bucket so I can access it from this web app by changing the CORS configuration, and that works: in the network tab, I see the fetch for the model.
Using Chrome, I can switch the display between mobile and desktop. The desktop display does not produce any errors, but the mobile one gives me errors that I do not understand.
Ignore the GET error and the date_created console.log; they are from another part of my code that is not relevant here.
I have searched various resources on deploying a tflite model to a web app, but have not found anything useful.
------------------EDIT-------------------
I have tried the method discussed in this github post, but I only get the following error (you can ignore the GET error and the isMobile console.log):
Upvotes: 6
Views: 1977
Reputation: 138
The solution is that the s3 objects were not uploaded correctly: the script that created the tflite models never fully wrote them, so it only produced empty files. I fixed the script. Here is the code that worked:
import tensorflow as tf

# Convert the right-hand keypoint classifier
model = tf.keras.models.load_model('model/R_keypoint_classifier_final.h5')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("tflite_models/R_converted_model.tflite", "wb") as r_file:
    r_file.write(tflite_model)

# Convert the left-hand keypoint classifier
model = tf.keras.models.load_model('model/L_keypoint_classifier_final.h5')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("tflite_models/L_converted_model.tflite", "wb") as l_file:
    l_file.write(tflite_model)
After that, I just added the files to the s3 bucket and used the code with the setWasmPath function.
Upvotes: 0
Reputation: 2742
Under the hood, the TensorFlow TFLite API uses WASM (WebAssembly). By default, it tries to load the relevant WASM files from the directory where the bundled JS file lives. This error indicates that the files, and therefore the WASM modules, could not be found. To address this, the path where the WASM files are located needs to be configured with tflite.setWasmPath prior to loading the model:
tflite.setWasmPath(
'https://cdn.jsdelivr.net/npm/@tensorflow/[email protected]/dist/'
);
Alternatively, if you want to serve the WASM files from your own server, you can copy them to the public directory of your project:
mkdir -p public/tflite_wasm
cp -rf node_modules/@tensorflow/tfjs-tflite/dist/tflite_web* public/tflite_wasm
Then set the path accordingly:
tflite.setWasmPath('tflite_wasm/');
Regarding the error you're seeing after adding setWasmPath as detailed in the Github issue and my initial response: based on the message "Failed to create TFLiteWebModelRunner: INVALID ARGUMENT" and looking at the source, the error is related to the model parameter, which is the model (s3 path) being supplied.
Based on the image you provided showing network activity, it looks like R_converted_model.tflite is downloaded successfully, but L_converted_model.tflite is not in that list. Without access to the model files to fully reproduce the issue, I would first verify that the L_converted_model.tflite file exists at that path in S3. I was able to reproduce the error you're seeing in this codepen demo by changing the model path to a non-existent file:
If the file does exist at that location, I would evaluate the model file itself by downloading it locally and trying to load it with the Tensorflow API, to identify if there is an issue with the file.
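As a quick first pass before reaching for the full TensorFlow API, you can check the downloaded file's bytes: a valid .tflite model is a FlatBuffer carrying the file identifier "TFL3" at byte offset 4, so an empty or truncated upload fails this check immediately. A minimal sketch (the looksLikeTFLite helper is my own illustration, not a library API):

```typescript
import { readFileSync } from "node:fs";

// Quick sanity check on a downloaded .tflite file: a valid model is a
// FlatBuffer whose file identifier "TFL3" is stored at byte offset 4.
function looksLikeTFLite(path: string): boolean {
  let data: Buffer;
  try {
    data = readFileSync(path);
  } catch {
    return false; // missing or unreadable file
  }
  return data.length > 8 && data.subarray(4, 8).toString("ascii") === "TFL3";
}
```

This only rules out the obviously broken cases (empty files, HTML error pages served instead of the model); a file that passes can still be malformed, which is where loading it with the TensorFlow API comes in.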
Upvotes: 6