RodR

Reputation: 31

How to use local Coral USB TPU with Google Colab (instead of Cloud TPU)

I have a USB TPU and would like to use it as a LOCAL RUNTIME in Google Colab. I was not able to find any resources on this topic. I know you can use a local runtime (local Jupyter), as explained here: https://research.google.com/colaboratory/local-runtimes.html

Do I need to install all the TPU libraries in my local Jupyter environment and then connect to that local Jupyter as the local runtime to start using my USB TPU in Colab?
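
For example, once the Edge TPU runtime and libraries are installed in the same environment as the local Jupyter server, I would hope to be able to verify the device from a notebook cell connected to the local runtime with something like this (a rough sketch, assuming the pycoral library is installed):

# Assumes the Edge TPU runtime (libedgetpu) and the pycoral library are
# installed in the Python environment the local Jupyter server uses.
from pycoral.utils.edgetpu import list_edge_tpus

# Should report the USB accelerator if the runtime can see it.
print(list_edge_tpus())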

Upvotes: 3

Views: 1104

Answers (2)

Christopher Lazok

Reputation: 1

Same Here.

  1. Try Docker or Podman.
  2. Try Jupyter instead of Colab (you're losing the cloud, kind of).
  3. Manually initialize and configure the TPU: https://www.tensorflow.org/guide/tpu (making your environment think it's using a specific paid cloud service; see the sketch after this list). You will lose speed, and one Coral is probably no better than Google's free TPU, granted that with the free cloud TPU you give up keeping your data private.
  4. Abandon Colab; go with an alternative like HuggingFace/Gradio.
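
For reference, the initialization described in that guide looks roughly like the sketch below. Note that it resolves a (Cloud) TPU endpoint, so out of the box it will not find a Coral USB accelerator; that is the part you would have to fake.

# Sketch of the Cloud TPU initialization from tensorflow.org/guide/tpu;
# the tpu argument / endpoint depends entirely on your setup.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)
print("TPU devices:", tf.config.list_logical_devices("TPU"))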

Many Coral Dev Board users are having this issue. Creating a 'swarm' of Raspberry Pi + TPU devices for 'training' has been an environment problem.

https://hub.docker.com/r/tensorflow/tensorflow

Run a Jupyter notebook server with your own notebook directory (assumed here to be ~/notebooks):

$ docker run -it --rm -v $(realpath ~/notebooks):/tf/notebooks -p 8888:8888 tensorflow/tensorflow:latest-jupyter

To use it, navigate to localhost:8888 in your browser. For the container to see the USB accelerator, you will also need to pass the USB device through (Coral's own Docker instructions do this with something like -v /dev/bus/usb:/dev/bus/usb --privileged).

Set the interpreter and manually configure the TPU. (No detailed instructions have been posted as of October 2023.)

Upvotes: 0

Nam Vu

Reputation: 1757

I'm not familiar with Google Colab, but it looks like it lets you run your code on your own hardware. You'll then need to load your model locally in order to run inference with it. There are multiple ways to run it, all listed here: https://coral.withgoogle.com/docs/edgetpu/inference/
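
For example, running a model on the Coral from Python typically goes through the TFLite interpreter with the Edge TPU delegate. A minimal sketch, assuming a Linux host with the Edge TPU runtime (libedgetpu) and the tflite_runtime package installed, and with model_edgetpu.tflite as a placeholder name for a model already compiled for the Edge TPU:

import numpy as np
import tflite_runtime.interpreter as tflite

# Load the Edge-TPU-compiled model and attach the Edge TPU delegate
# (libedgetpu.so.1 is the Linux library name).
interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

# Run inference on a dummy input with the model's expected shape and dtype.
input_details = interpreter.get_input_details()[0]
dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy)
interpreter.invoke()

output = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
print(output.shape)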

Upvotes: 1
