Reputation: 41
I use Google Colab to train on my dataset. I uploaded my dataset to Google Drive and access it from Colab, but running the train.py script produces the following errors. More precisely, I run:
!python3 /content/drive/tensorflow1/models/research/object_detection/train.py --logtostderr --train_dir=/content/drive/tensorflow1/models/research/object_detection/training/ --pipeline_config_path=/content/drive/tensorflow1/models/research/object_detection/training/faster_rcnn_inception_v2_pets.config
and I get these errors:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "/usr/lib/python3.6/imp.py", line 243, in load_module
return load_dynamic(name, filename, file)
File "/usr/lib/python3.6/imp.py", line 343, in load_dynamic
return _load(spec)
ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/content/drive/tensorflow1/models/research/object_detection/train.py", line 47, in <module>
import tensorflow as tf
File "/usr/local/lib/python3.6/dist-packages/tensorflow/__init__.py", line 24, in <module>
from tensorflow.python import pywrap_tensorflow # pylint: disable=unused-import
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/__init__.py", line 49, in <module>
from tensorflow.python import pywrap_tensorflow
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow.py", line 74, in <module>
raise ImportError(msg)
ImportError: Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "/usr/lib/python3.6/imp.py", line 243, in load_module
return load_dynamic(name, filename, file)
File "/usr/lib/python3.6/imp.py", line 343, in load_dynamic
return _load(spec)
ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory
Failed to load the native TensorFlow runtime.
See https://www.tensorflow.org/install/install_sources#common_installation_problems
for some common reasons and solutions. Include the entire stack trace
above this error message when asking for help.
Do I need to install CUDA 9 or cuDNN, or upload them to Google Drive first, in order to use them on Colab? How can I get past these errors?
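For reference, this is roughly how I make my Drive files visible to the Colab runtime (the standard Drive mount call; my exact paths above may differ):

# Mount Google Drive so files uploaded there are reachable from the notebook
from google.colab import drive
drive.mount('/content/drive')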
Upvotes: 1
Views: 17092
Reputation: 1861
First, enable the GPU on your Google Colab notebook:
Go to Menu > Runtime > Change runtime type.
Change the hardware accelerator to GPU.
How to install CUDA in Google Colab GPU's
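After switching to a GPU runtime, you can confirm from a Colab cell that a GPU is attached and check whether the CUDA library the traceback complains about is present (a rough sketch):

# Show the GPU attached to the runtime (fails if the runtime is CPU-only)
!nvidia-smi

# Look for the cuBLAS library mentioned in the error (libcublas.so.9.0)
!ldconfig -p | grep libcublas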
Upvotes: -1
Reputation: 406
Since tensorflow-gpu>=1.5.0 requires CUDA 9 (the libcublas.so.9.0 the traceback cannot find), you should install tensorflow-gpu==1.4.0 instead, which was built against CUDA 8.
pip install --upgrade tensorflow-gpu==1.4
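In a Colab cell the same install is run with a leading !, and you can verify which version actually loads afterwards (a sketch; restart the runtime after the install so the downgrade takes effect):

# Cell 1: downgrade to a CUDA 8 build of TensorFlow
!pip install --upgrade tensorflow-gpu==1.4.0

# Cell 2, after restarting the runtime:
import tensorflow as tf
print(tf.__version__)              # expect 1.4.0
print(tf.test.is_gpu_available())  # True only if a GPU runtime is enabled and the CUDA libs load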
Please refer to the two links below.
https://github.com/tensorflow/tensorflow/issues/15604
https://www.tensorflow.org/install/install_sources#tested_source_configurations
Upvotes: 0
Reputation: 3240
Do keep in mind that you have to enable the GPU explicitly on a notebook before you can use tensorflow-gpu. I suspect that this step is missing.
To enable the GPU, use the menu: Runtime -> Change runtime type -> Hardware accelerator -> GPU.
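Once the notebook reconnects with a GPU backend, a quick sanity check (assuming a TensorFlow 1.x install) is:

import tensorflow as tf

# Prints something like '/device:GPU:0' when a GPU runtime is active, or an empty string otherwise
print(tf.test.gpu_device_name())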
Mark this as the solution if it helped, so others can benefit.
Upvotes: 8