Is there a way to use a TPU v3 instead of a TPU v2 in Google Colab Pro?
With the TPU v2 I unfortunately get the error message Compilation failure: Ran out of memory in memory space hbm. Used 8.29G of 7.48G hbm. Exceeded hbm capacity by 825.60M., which I no longer get with a TPU v3, because a TPU v3 has more memory.
Does anyone know of a way to do this?
This is how I initialize the TPU:
import tensorflow as tf

try:
    tpu = tf.distribute.cluster_resolver.TPUClusterResolver()  # TPU detection
    print('Running on TPU ', tpu.cluster_spec().as_dict()['worker'])
except ValueError:
    raise BaseException('ERROR: Not connected to a TPU runtime; please see the previous cell in this notebook for instructions!')

tf.config.experimental_connect_to_cluster(tpu)
tf.tpu.experimental.initialize_tpu_system(tpu)
# tpu_strategy = tf.distribute.experimental.TPUStrategy(tpu)  # older experimental API
strategy = tf.distribute.TPUStrategy(tpu)
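For reference, the model is later built under strategy.scope(); each of the 8 TPU cores holds a full replica, so the per-core HBM (roughly 8 GB on a v2, 16 GB on a v3) is what the compilation failure above runs out of. A simplified sketch of that usage (the model architecture and batch size below are placeholders, not my real code):

# Simplified sketch only: placeholder Keras model, not my actual one.
print('Number of replicas:', strategy.num_replicas_in_sync)  # 8 on a v2-8 or v3-8

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation='relu', input_shape=(784,)),
        tf.keras.layers.Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

# The global batch is split evenly across the replicas, so per-core memory
# use scales with the model plus (global batch size / num_replicas_in_sync).
GLOBAL_BATCH_SIZE = 64 * strategy.num_replicas_in_sync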
Upvotes: 3
Views: 5563
Reputation: 881
Short answer: no. There isn't a way to specify which TPU version you want in Colab. Kaggle, though, provides v3-8 TPUs, I believe (which may be subject to change as well, since it's free). Also, as the other answer points out, you can spin up a paid Cloud TPU yourself, for which you can specify the exact hardware.
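If you do go the paid Cloud TPU route, the resolver from the question can be pointed at the TPU you created by name instead of auto-detecting the Colab one. A minimal sketch, assuming a hypothetical v3-8 node called my-tpu-v3 in your own project (the name, zone, and project below are placeholders):

import tensorflow as tf

# Hypothetical placeholders: replace name, zone, and project with the Cloud
# TPU v3-8 you provisioned and pay for yourself.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu='my-tpu-v3',
    zone='us-central1-a',
    project='my-gcp-project',
)
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)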
Upvotes: 4
Reputation: 466
As far as I know, the free version of Colab does not provide any way to choose either the GPU or the TPU model, and neither does the Pro version.
You can pay for a dedicated TPU v3 on Cloud TPU for $8.00/hour if you really need to.
Quote from Colab FAQ:
There is no way to choose what type of GPU you can connect to in Colab at any given time. Users who are interested in more reliable access to Colab’s fastest GPUs may be interested in Colab Pro.
Upvotes: 1