Reputation: 31
I have a USB TPU and would like to use it as a local runtime in Google Colab. I was not able to find any resources on this topic. Colab can connect to a local runtime (a local Jupyter server), as explained here: https://research.google.com/colaboratory/local-runtimes.html
Do I need to install all the TPU libraries in my local Jupyter environment and then connect to that local Jupyter server as a local runtime in order to start using my USB TPU from Colab?
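If I understand the linked page correctly, connecting Colab to a local Jupyter server boils down to something like the following (commands taken from that page as I remember them, so the exact flags may differ):

$ pip install jupyter_http_over_ws
$ jupyter serverextension enable --py jupyter_http_over_ws
$ jupyter notebook \
    --NotebookApp.allow_origin='https://colab.research.google.com' \
    --port=8888 --NotebookApp.port_retries=0

and then pasting the printed backend URL (including the token) into Colab's "Connect to local runtime" dialog. What I'm unsure about is what, if anything, has to be installed on top of that for the USB TPU.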
Upvotes: 3
Views: 1104
Reputation: 1
Same here.
Many Coral Dev Board users are having this issue. Creating a 'swarm' of Raspberry Pi + Edge TPU nodes for 'training' has been an environment-setup problem.
https://hub.docker.com/r/tensorflow/tensorflow
Start a TensorFlow container that includes Jupyter:
$ docker run -it --rm -v $(realpath ~/notebooks):/tf/notebooks -p 8888:8888 tensorflow/tensorflow:latest-jupyter
This runs a Jupyter notebook server with your own notebook directory (assumed here to be ~/notebooks). To use it, navigate to localhost:8888 in your browser.
Then connect Colab to that local runtime, and set up the Edge TPU inside the container manually. (No detailed instructions had been posted as of 10-2023.)
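Untested sketch: to make the USB Accelerator visible inside that same container, the USB bus can be passed through to Docker, for example with --privileged and a bind mount of /dev/bus/usb; the Edge TPU runtime (libedgetpu) still has to be installed inside the container afterwards:

$ docker run -it --rm --privileged \
    -v /dev/bus/usb:/dev/bus/usb \
    -v $(realpath ~/notebooks):/tf/notebooks \
    -p 8888:8888 tensorflow/tensorflow:latest-jupyter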
Upvotes: 0
Reputation: 1757
I'm not familiar with Google Colab, but it looks like it lets you run your model on your own hardware. You'll then need to locate your model file in order to run inference with it. There are multiple ways to run inference on the Edge TPU, all listed here: https://coral.withgoogle.com/docs/edgetpu/inference/
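For a quick local sanity check (outside Colab), the PyCoral route from that documentation looks roughly like this on a Debian-based machine; the package names and example files come from the Coral getting-started guide and may have changed since:

$ echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list
$ curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
$ sudo apt-get update && sudo apt-get install libedgetpu1-std python3-pycoral
$ git clone https://github.com/google-coral/pycoral.git && cd pycoral
$ bash examples/install_requirements.sh classify_image.py
$ python3 examples/classify_image.py \
    --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
    --labels test_data/inat_bird_labels.txt \
    --input test_data/parrot.jpg

If that runs on the Edge TPU, the same model file can then be loaded from whatever environment Colab connects to.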
Upvotes: 1