Reputation: 51
In my current project I'm using machine learning on the Raspberry Pi for sensor fusion. Since I heard about the release of TensorFlow Lite, I'm really interested in deploying it and using it to run Lite models on the platform.
The TensorFlow website has hints for Android and iOS, but I couldn't find anything about other platforms. Is there a (WIP) installation/compile guide for bringing TF Lite to the Raspberry Pi?
TIA
Upvotes: 4
Views: 3722
Reputation: 61
I would suggest the following links:
Keep in mind that if you use the interpreter only, you have to follow slightly different logic:
import tensorflow as tf

# Instantiate the interpreter
interpreter = tf.lite.Interpreter(model_path=PATH_TO_SAVED_TFLITE_MODEL)
# Allocate memory for tensors
interpreter.allocate_tensors()
# Get input and output tensor details
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
# Add a batch dimension if needed (data_tensor is your input data)
input_data = tf.expand_dims(data_tensor, axis=0)
# Predict
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
# Obtain results
predictions = interpreter.get_tensor(output_details[0]['index'])
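One caveat with the interpreter-only flow: set_tensor will complain if the input shape or dtype does not match what the converted model reports, so it is worth checking explicitly. A minimal sketch, assuming data_tensor is a NumPy array of your sensor readings:
import numpy as np

# The converted model reports the exact input shape and dtype it expects.
expected_shape = input_details[0]['shape']   # e.g. [1, num_features]
expected_dtype = input_details[0]['dtype']   # e.g. np.float32

input_data = np.expand_dims(data_tensor, axis=0).astype(expected_dtype)
assert list(input_data.shape) == list(expected_shape)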
Upvotes: 0
Reputation: 639
You can install the TensorFlow pip package on a Raspberry Pi with "pip install tensorflow". However, if you only want TFLite, you can build a smaller pip package that contains just the TFLite interpreter (you can then do the model conversion on another, bigger machine).
Info on how to do it is here: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/tools/pip_package
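For reference, a rough sketch of the conversion step on the bigger machine, assuming you have a SavedModel directory (the paths below are placeholders):
import tensorflow as tf

# Convert a SavedModel to a .tflite flatbuffer on a desktop/server machine.
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
tflite_model = converter.convert()

# Write the flatbuffer out; copy this file to the Raspberry Pi and load it
# there with the tflite_runtime interpreter.
with open("mobilenet_float.tflite", "wb") as f:
    f.write(tflite_model)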
Then you can use it on the Pi. Here is an example of how you might run a MobileNet model on webcam frames:
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="mobilenet_float.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]
height, width = input_details['shape'][1], input_details['shape'][2]

cap = cv2.VideoCapture(0)  # open 0th web camera
while True:
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    frame = cv2.resize(frame, (width, height))
    # float MobileNet models expect input scaled to roughly [-1, 1]
    frame = np.reshape(frame, input_details['shape']).astype(np.float32) / 128.0 - 1.0
    interpreter.set_tensor(input_details['index'], frame)
    interpreter.invoke()
    labels = interpreter.get_tensor(output_details['index'])
    top_label_index = np.argmax(labels, axis=-1)
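If you also have a labels file for the model (the labels.txt filename below is just a placeholder), you can map the index back to a human-readable class name:
# labels.txt is assumed to contain one class name per line, in output order.
with open("labels.txt") as f:
    class_names = [line.strip() for line in f]

print("Predicted:", class_names[int(top_label_index[0])])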
Hope this helps.
Upvotes: 0
Reputation: 11
If you are still trying to get TensorFlow Lite running on a Raspberry Pi 3, my pull request may be useful. Please look at https://github.com/tensorflow/tensorflow/pull/24194.
Following the steps there, two apps (label_image and camera) can be run on a Raspberry Pi 3.
Best,
--Jim
Upvotes: 1
Reputation: 14910
There is a very small section on Raspberry Pi in the TFLite docs at https://www.tensorflow.org/mobile/tflite/devguide#raspberry_pi. That section links to this GitHub doc with instructions for building TFLite on Raspberry Pi - tensorflow/rpi.md.
There is no official demo app yet, but the first location says one is planned. It will probably be shared at that same location when ready (that is where the Android and iOS demo apps are described).
Upvotes: 0